[399] **viXra:1705.0477 [pdf]**
*submitted on 2017-05-31 14:17:35*

**Authors:** George R. Briggs

**Comments:** 3 Pages.

Abstract: The cosmological constant comes in holographic and non-holographic versions, each of which supports Friedmann's equations concerning the matter density of the universe in different ways, and supports 8-fold supersymmetry and cyclic-universe E8 symmetry theory in general.

**Category:** Relativity and Cosmology

[398] **viXra:1705.0472 [pdf]**
*submitted on 2017-05-31 11:15:55*

**Authors:** Terubumi Honjou

**Comments:** 3 Pages.

The dark energy density of the accelerating, expanding universe is considered constant.
This means that dark energy continues to be supplied to the expanding space from somewhere.
Dark energy is considered to be the constant lambda (λ) of the gravitational equation, which Einstein removed after having added it.
The energy value of lambda is desired to be zero.
In the elementary particle pulsation hypothesis, the average density of dark energy is set to zero as the first assumption.
Even as the universe expands and the dark energy density thins, the energy value of lambda is always zero.
It is expressed as the horizontal line of the energy waveform diagram.
The pulsation hypothesis can be described as a deviation from this horizontal line, the physics of the wave's excursion.
The horizontal line is assumed to be at zero energy regardless of whether the dark energy continues to be supplied from the surplus dimension.

**Category:** Astrophysics

[397] **viXra:1705.0470 [pdf]**
*submitted on 2017-05-31 08:41:13*

**Authors:** Renzun Lian

**Comments:** 20 Pages.

In Part I of the Line-Surface formulation of the ElectroMagnetic-Power-based Characteristic Mode Theory for Metal-Material combined objects (LS-MM-EMP-CMT), the relevant fundamental principles were established; some valuable complements and improvements are made in this Part II.
In this Part II, the traditional surface equivalence principle for a homogeneous material body whose boundary is constructed only from a closed surface is generalized to a line-surface equivalence principle for a homogeneous material body whose boundary can include lines and open surfaces in addition to a closed surface. A new line-surface formulation of the input/output power operator for metal-material combined objects is given, and this new formulation is more advantageous than the one given in Part I. More detailed formulations for establishing the LS-MM-EMP-CMT are explicitly provided here, such as the formulations corresponding to the decomposition of currents and their domains and those corresponding to variable unification.
In addition, a new concept, intrinsic resonance, is introduced in this paper, and a new Characteristic Mode (CM) set, the intrinsically resonant CM set, is introduced into the EMP-CMT family.

**Category:** Mathematical Physics

[396] **viXra:1705.0469 [pdf]**
*submitted on 2017-05-31 06:24:06*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 5 Pages.

A foreseeable task.

**Category:** General Science and Philosophy

[395] **viXra:1705.0468 [pdf]**
*replaced on 2017-05-31 08:52:43*

**Authors:** Ilija Barukčić

**Comments:** Pages.

Bell's theorem is mathematically and logically inconsistent, in fact a logical fallacy. A serious experiment cannot confirm the logical consistency of something which is logically inconsistent.

**Category:** Quantum Physics

[394] **viXra:1705.0467 [pdf]**
*replaced on 2017-06-01 23:38:49*

**Authors:** Stephen P. Smith, Cambrian Lopez, Nicole Lam

**Comments:** 12 Pages.

Smith, Lopez and Lam described how to combine genetic similarities, measured in centimorgans (cM), among declared relatives in an outside pedigree, and to concentrate those cM values into a single cM measurement for an envoy that is a representative of the outside pedigree. An unknown relative is presumed to be a descendant of the envoy, but has cM matches with relatives in the outside pedigree. That prior effort was a univariate analysis, where there is only one unknown relative with matches to others in the outside pedigree. The present paper presents a bivariate analysis, where there are two sisters who have matches with others in the outside pedigree. The cM values are now paired, in that any DNA-tested member of the pedigree has two cM values that match to both sisters. The bivariate analysis offers more efficient use of information compared to two univariate analyses done for each sister in turn. This advantage comes with an increase in model complexity, in that a model is developed for treating three mutually exclusive categories representing genes found in the sisters: genes in the first sister but not the second; genes common to both sisters; and genes in the second sister but not the first. The model is applied to the inheritance of the cM values in the pedigree. Even though the number of random effects is increased by a factor of three, the number of fixed effects that actually spend two degrees of freedom is unchanged from the univariate analysis. This is on top of the doubling of the number of observations for the bivariate analysis compared to one univariate analysis.

**Category:** Quantitative Biology

[393] **viXra:1705.0466 [pdf]**
*replaced on 2017-05-31 08:55:02*

**Authors:** Liu Ran

**Comments:** 5 Pages. Added an explanation of dark matter and dark energy.

I happened to come across the holographic universe model proposed by University of London physicist David Bohm, which bears a striking resemblance to the divine-creation hypothesis, so I dug out a manuscript I wrote years ago and am publishing it here, hoping others will join the discussion.

**Category:** History and Philosophy of Physics

[392] **viXra:1705.0465 [pdf]**
*submitted on 2017-05-29 13:33:55*

**Authors:** Peter Bissonnet

**Comments:** 1 Page.

Does Cause and Effect automatically imply Conservation of Energy? This paper takes an extremely simplistic example and shows that there is an apparent inconsistency on the macroscopic level.

**Category:** Classical Physics

[391] **viXra:1705.0464 [pdf]**
*submitted on 2017-05-29 17:51:51*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page. 1 illustrative graph

The vast majority of polycyclic aromatic hydrocarbons (PAHs) are formed in intermediate stages of a star's evolution. This is in line with the astrochemical principle of stellar metamorphosis. Stars in intermediate stages of evolution are outlined in a graph and comprise M, L, Y and T type brown dwarfs all the way to the ocean-world stages of stellar evolution. PAHs are mostly formed in Pop II stars according to the General Theory of Stellar Metamorphosis.

**Category:** Astrophysics

[390] **viXra:1705.0463 [pdf]**
*submitted on 2017-05-30 05:50:38*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 14 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Normalized Similarity Measure considered to Exhaustion [1].

**Category:** Statistics

[389] **viXra:1705.0462 [pdf]**
*submitted on 2017-05-29 12:01:22*

**Authors:** Radhakrishnamurty Padyala

**Comments:** 5 Pages.

Kelvin, one of the founders of thermodynamics, proposed an economical, thermodynamic method to heat houses. The method employs a combination of two Carnot heat engines. One engine runs in the clockwise direction while the other runs in the counterclockwise direction. This combination is claimed to provide much more heat into the house for a given amount of fuel, compared to that obtained by burning the fuel inside the house. The ratio of the two heats, one obtained by Kelvin's method and the other obtained by burning the fuel inside the house, is known as the heat multiplication factor (HMF). This factor could theoretically be quite high (a typical calculation gives more than a factor of 6). We show in this note that Kelvin's method is fallacious: it is impossible to get any more heat by using Kelvin's method than the heat that could be obtained from combustion (burning) of the fuel.
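
The "typical calculation" mentioned in the abstract can be reproduced as an idealized back-of-the-envelope sketch: a Carnot engine running between the flame and the house drives a Carnot heat pump running between the outdoors and the house. The temperatures below are assumed for illustration only and are not taken from the paper.

```python
# Illustrative arithmetic for the claimed heat multiplication factor (HMF).
# All temperatures are assumptions for the sketch, not values from the paper.
T_flame = 1500.0   # K, combustion temperature (assumed)
T_house = 294.0    # K, indoor temperature (assumed)
T_out = 273.0      # K, outdoor temperature (assumed)

Q_fuel = 1.0                              # heat released by burning the fuel
W = Q_fuel * (1 - T_house / T_flame)      # Carnot engine work output
Q_engine_reject = Q_fuel - W              # engine's waste heat, dumped indoors
COP = T_house / (T_house - T_out)         # ideal (Carnot) heat-pump COP
Q_pumped = W * COP                        # heat the pump delivers indoors

hmf = (Q_engine_reject + Q_pumped) / Q_fuel
print(hmf)
```

With these assumed temperatures the ideal bookkeeping indeed yields an HMF well above 6, which is exactly the kind of result the note goes on to argue is fallacious.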

**Category:** Thermodynamics and Energy

[388] **viXra:1705.0461 [pdf]**
*submitted on 2017-05-29 12:33:16*

**Authors:** Edgar Valdebenito

**Comments:** 12 Pages.

This note presents some formulas and fractals related to the equation x^3+x^2+1=0.

**Category:** Number Theory

[387] **viXra:1705.0460 [pdf]**
*replaced on 2018-02-13 14:27:15*

**Authors:** John Yuk Ching Ting

**Comments:** 68 Pages. Targeting the General Public - Rigorous proofs for Riemann hypothesis, Polignac's and Twin prime conjectures

L-functions form an integral part of the 'L-functions and Modular Forms Database' with far-reaching implications. In perspective, the Riemann zeta function is the simplest example of an L-function. The Riemann hypothesis refers to the 1859 proposal by Bernhard Riemann whereby all nontrivial zeros are [mathematically] conjectured to lie on the critical line of this function. This proposal is equivalently stated in this research paper as: all nontrivial zeros are [geometrically] conjectured to exactly match the 'Origin' intercepts of this function. Deeply entrenched in number theory, the prime number theorem entails analysis of the prime counting function for prime numbers. Solving the Riemann hypothesis would enable complete delineation of this important theorem. Involving proposals on the magnitude of prime gaps and their associated sets of prime numbers, the Twin prime conjecture deals with prime gap = 2 (representing twin primes) and is thus a subset of Polignac's conjecture, which deals with all even prime gaps = 2, 4, 6, ... (representing prime numbers in totality except for the first prime number '2'). Both nontrivial zeros and prime numbers are Incompletely Predictable entities, allowing us to employ our novel Virtual Container Research Method to solve the associated hypothesis and conjectures.

**Category:** Number Theory

[386] **viXra:1705.0448 [pdf]**
*submitted on 2017-05-29 08:31:20*

**Authors:** Said Broumi, Assia Bakali, Mohamed Talea, Florentin Smarandache

**Comments:** 7 Pages.

In this study, we propose an approach to determine the shortest path length between a pair of specified nodes s and t on a network whose edge weights are represented by trapezoidal neutrosophic numbers. Finally, an illustrative example is provided to show the applicability and effectiveness of the proposed approach.

**Category:** General Mathematics

[385] **viXra:1705.0446 [pdf]**
*submitted on 2017-05-29 08:33:21*

**Authors:** Florentin Smarandache

**Comments:** 17 Pages.

This paper introduces two new fusion rules for combining quantitative basic belief assignments.
These rules, although very simple, have not been proposed in the literature so far and could serve as useful alternatives because of their low computation cost with respect to the recent advanced Proportional Conflict Redistribution rules developed in the DSmT framework.

**Category:** General Mathematics

[384] **viXra:1705.0437 [pdf]**
*submitted on 2017-05-29 08:48:19*

**Authors:** Akbar Rezaei, Arsham Borumand Saeid, Florentin Smarandache

**Comments:** 15 Pages.

In this paper, we introduce the notion of (implicative) neutrosophic filters in BE-algebras. The relation between implicative neutrosophic filters and neutrosophic filters is investigated, and we show that in self-distributive BE-algebras these notions are equivalent.

**Category:** General Mathematics

[383] **viXra:1705.0434 [pdf]**
*submitted on 2017-05-29 08:53:23*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali, Luige Vladareanu

**Comments:** 6 Pages.

In this work, a neutrosophic network method is proposed for finding the shortest path length with single valued trapezoidal neutrosophic numbers. The proposed algorithm gives the shortest path length from source node to destination node using a score function. Here the weights of the edges are considered to be single valued trapezoidal neutrosophic numbers. Finally, a numerical example is used to illustrate the efficiency of the proposed approach.

**Category:** General Mathematics

[382] **viXra:1705.0432 [pdf]**
*submitted on 2017-05-29 08:56:53*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali

**Comments:** 8 Pages.

In this paper, we first define the concept of bipolar single valued neutrosophic graphs as a generalization of bipolar fuzzy graphs, N-graphs, intuitionistic fuzzy graphs, single valued neutrosophic graphs and bipolar intuitionistic fuzzy graphs.

**Category:** General Mathematics

[381] **viXra:1705.0431 [pdf]**
*submitted on 2017-05-29 08:57:42*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali

**Comments:** 6 Pages.

In this paper, the authors propose an extended version of Dijkstra's algorithm for finding the shortest path on a network where the edge weights are characterized by interval valued neutrosophic numbers. Finally, a numerical example is given to explain the proposed algorithm.
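
A minimal sketch of this kind of algorithm, under assumptions: the score function below is one common choice for interval-valued neutrosophic numbers, not necessarily the paper's, and the graph data is invented. Each edge weight ([TL,TU],[IL,IU],[FL,FU]) is collapsed to a crisp cost, after which standard Dijkstra applies.

```python
import heapq

def score(w):
    """Crisp score of an interval-valued neutrosophic number
    ([TL,TU],[IL,IU],[FL,FU]); an illustrative formula, larger = better."""
    (tl, tu), (il, iu), (fl, fu) = w
    return (4 + tl + tu - il - iu - fl - fu) / 6.0

def dijkstra(graph, src):
    """Standard Dijkstra on crisp costs derived from neutrosophic weights.
    graph: {node: [(neighbor, neutrosophic_weight), ...]}"""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + (1.0 - score(w))  # crisp edge cost in [0, 1]
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Invented example: one edge s -> t with score 0.7, so cost about 0.3.
w_good = ((0.5, 0.7), (0.2, 0.4), (0.1, 0.3))
costs = dijkstra({"s": [("t", w_good)]}, "s")
```

The design choice here is to score edges up front and run an unmodified Dijkstra; approaches that compare neutrosophic path weights directly would instead rank candidate paths by score at each relaxation step.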

**Category:** General Mathematics

[380] **viXra:1705.0430 [pdf]**
*submitted on 2017-05-29 08:58:40*

**Authors:** Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 8 Pages.

Personality tests are most commonly of the objective type, where users rate their own behaviour. Instead of a single forced choice, users can be provided with more options. A person is in general not capable of judging his or her behaviour very precisely and categorizing it into a single category. Since it is self-rating, a lot of uncertain and indeterminate feeling is involved.

**Category:** General Mathematics

[379] **viXra:1705.0428 [pdf]**
*submitted on 2017-05-29 09:00:23*

**Authors:** Florentin Smarandache

**Comments:** 10 Pages.

During the process of adaptation of a being (plant, animal, or human) to a new environment or conditions, the being partially evolves, partially devolves (degenerates), and partially is indeterminate {i.e. neither evolving nor devolving, therefore unchanged (neutral), or the change is unclear, ambiguous, vague}, as in neutrosophic logic. Thanks to adaptation, one therefore has evolution, involution, and indeterminacy (or neutrality), each of these three neutrosophic components in some degree.

**Category:** General Mathematics

[378] **viXra:1705.0426 [pdf]**
*submitted on 2017-05-29 09:02:36*

**Authors:** Dragisa STANUJKIC, Edmundas Kazimieras ZAVADSKAS, Florentin SMARANDACHE, Willem K.M. BRAUERS, Darjan KARABASEVIC

**Comments:** 12 Pages.

The aim of this manuscript is to propose a new extension of the MULTIMOORA method adapted for use with neutrosophic sets. By using single valued neutrosophic sets, the MULTIMOORA method can be made more efficient for solving complex problems whose solution requires assessment and prediction, i.e. problems associated with inaccurate and unreliable data. The suitability of the proposed approach is presented through an example.

**Category:** General Mathematics

[377] **viXra:1705.0425 [pdf]**
*submitted on 2017-05-29 09:04:39*

**Authors:** Abdel-Nasser Hussian, Mai Mohamed, Mohamed Abdel-Baset, Florentin Smarandache

**Comments:** 6 Pages.

Smarandache presented neutrosophic theory as a tool for handling undetermined information, and together with Wang et al. introduced single valued neutrosophic sets, a special kind of neutrosophic set that can be used expediently to deal with real-world problems, especially in decision support. In this paper, we propose linear programming problems based on a neutrosophic environment. Neutrosophic sets are characterized by three independent parameters, namely truth-membership degree (T), indeterminacy-membership degree (I) and falsity-membership degree (F), making them more capable of handling imprecise parameters. We also transform the neutrosophic linear programming problem into a crisp programming model by using the neutrosophic set parameters. To measure the efficiency of our proposed model we solved several numerical examples.

**Category:** General Mathematics

[376] **viXra:1705.0423 [pdf]**
*submitted on 2017-05-29 09:11:42*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali

**Comments:** 19 Pages.

In this article, we combine the concept of bipolar neutrosophic set and graph theory. We introduce the notions of bipolar single valued neutrosophic graphs, strong bipolar single valued neutrosophic graphs, complete bipolar single valued neutrosophic graphs, regular bipolar single valued neutrosophic graphs and investigate some of their related properties.

**Category:** General Mathematics

[375] **viXra:1705.0420 [pdf]**
*submitted on 2017-05-29 09:15:33*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali

**Comments:** 16 Pages.

The notion of single valued neutrosophic sets is a generalization of fuzzy sets and intuitionistic fuzzy sets. We apply the concept of single valued neutrosophic sets, an instance of neutrosophic sets, to graphs. We introduce certain types of single valued neutrosophic graphs (SVNG) and investigate some of their properties with proofs and examples.

**Category:** General Mathematics

[374] **viXra:1705.0418 [pdf]**
*submitted on 2017-05-29 09:17:27*

**Authors:** Said Broumi, Mohamed Talea, Florentin Smarandache, Assia Bakali

**Comments:** 8 Pages.

In this article, we extend the neutrosophic graph-based multicriteria decision making method (NGMCDM) introduced by Sahin [49] to the case of interval valued neutrosophic graph theory. We also give an algorithm to solve decision making problems by using interval valued neutrosophic graphs. Finally, an illustrative example is given, and a comparison analysis is conducted between the proposed approach and other existing methods to verify the feasibility and effectiveness of the developed approach.

**Category:** General Mathematics

[373] **viXra:1705.0414 [pdf]**
*submitted on 2017-05-29 09:22:13*

**Authors:** Kenta Takay, Toshinori Asai, Valeri Kroumov, Florentin Smarandache

**Comments:** 6 Pages.

In the process of developing a control strategy for mobile robots, simulation is important for testing the software components, robot behavior and control algorithms in different surrounding environments. In this paper we introduce a simulation environment for mobile robots based on ROS and Gazebo.
We show that after properly creating the robot models under Gazebo, the code developed for the simulation process can be directly implemented in the real robot without modifications.
Autonomous navigation tasks and 3D-mapping simulation using control programs under ROS are presented. Both the simulation and experimental results agree very well and show the usability of the developed environment.

**Category:** General Mathematics

[372] **viXra:1705.0411 [pdf]**
*submitted on 2017-05-29 09:25:08*

**Authors:** Victor Christianto, Florentin Smarandache, Yunita Umniyati

**Comments:** 12 Pages.

It has been known for a long time that the cosmic sound wave has been present since the early epoch of the Universe. Signatures of its existence are abundant. However, such an acoustic model of cosmology is rarely developed fully into a complete framework from the notion of space up to the sky.

**Category:** General Mathematics

[371] **viXra:1705.0410 [pdf]**
*replaced on 2017-05-30 20:59:45*

**Authors:** Hong Lai Zhu

**Comments:** 71 Pages.

This is the first part of the full paper. Although the theory of partial differential equations (PDEs) was established nearly 300 years ago, many important problems have not been resolved, such as: What are the general solutions of the Laplace equation, acoustic wave equation, Helmholtz equation, heat conduction equation, Schrodinger equation and other important equations? How do we solve the problems of definite solutions, which have universal significance, for these equations? What are the laws governing the general solution of mth-order linear PDEs with n variables (n,m≥2)? Is there any general rule for the solution of a PDE in arbitrary orthogonal coordinate systems? Can we obtain the general solutions of vector PDEs? Are there very simple methods to quickly and efficiently obtain exact solutions, and even general solutions, of nonlinear PDEs? Etc. These problems are all effectively solved in this paper. Substituting the results into the original equations, we have verified that they are all correct.

**Category:** Functions and Analysis

[370] **viXra:1705.0409 [pdf]**
*submitted on 2017-05-28 15:37:41*

**Authors:** John R. Berryhill

**Comments:** 6 Pages. Discussion at unmysticalphysics.com

The calculation of lookback time and particle horizon in the Lambda-CDM model is simplified by use of an explicit formula for the cosmic expansion scale factor S(t). The present article continues the exposition begun in viXra 1704.0303.

**Category:** Relativity and Cosmology

[369] **viXra:1705.0408 [pdf]**
*submitted on 2017-05-28 18:35:40*

**Authors:** Evgeny A. Novikov

**Comments:** 8 Pages.

Quantum modification of general relativity (Qmoger) is supported by cosmic data (without fitting). The Qmoger equations consist of the Einstein equations with two additional terms responsible for production/absorption of matter. In Qmoger cosmology there was no Big Bang, and matter is continuously produced by the Vacuum. In particular, production of ultralight gravitons with a tiny electric dipole moment (EDM) started about 284 billion years ago. Quantum effects dominate the interaction of these particles, and they form a quantum condensate. Under the influence of gravitation, the condensate forms galaxies and produces ordinary matter, including photons. As one important result of this activity, it recently created us, the people, and continues to support us. In particular, our subjective experiences are a result of an interaction between the background condensate and the neural system of the brain. The action potentials of the neural system create traps and coherent dynamic patterns in the dipolar condensate. So, our subjective experiences are graviton-based. Some problems with the origin of life can also be clarified by taking into account the background dipolar condensate. It seems natural to call this graviton condensate bright matter. It not only produced ordinary matter, including light, but also produced and is nurturing conscious life as we know it and, perhaps, some other forms of life in the universe. The EDM of gravitons is small and existing telescopes cannot see them, but we actually see bright matter in our subjective experiences. So, cosmology and brain science must work together to investigate bright matter, which will be the most important enterprise of humankind.

**Category:** Relativity and Cosmology

[368] **viXra:1705.0407 [pdf]**
*submitted on 2017-05-28 23:37:47*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 13 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Normalized Similarity Measure considered to Exhaustion [1].

**Category:** Statistics

[367] **viXra:1705.0406 [pdf]**
*submitted on 2017-05-28 23:57:39*

**Authors:** Temur Z. Kalanov

**Comments:** 13 Pages.

The critical analysis of the generally accepted foundations of quantum mechanics is proposed. The purpose of the analysis is to prove that the foundations include logical errors. The principle of the unity of formal logic and of rational dialectics is a methodological basis of the analysis. The result is as follows: (a) the generally accepted foundations (i.e., the interpretation of the experimental data on diffraction of quantum particles; the conception of wave-corpuscle dualism; the probabilistic interpretation of the psi-function) are logical errors; (b) the pseudo-informational meaning is the true meaning of the psi-function. Conclusion is that quantum mechanics is not a physical, objective theory but a pseudo-informational one. Therefore, quantum mechanics should be replaced by a physical, objective quantum theory. The new (correct) basis of quantum theory is proposed.

**Category:** Quantum Physics

[366] **viXra:1705.0405 [pdf]**
*submitted on 2017-05-28 11:49:57*

**Authors:** Ahmed Ibrahim Mohamed Ahmed, Mohamed Yehia Zakaria Arafa, Shady Essam Ramzy Taodharos

**Comments:** 7 Pages, E-mail: 15004@stemegypt.edu.eg

Clean water and food supply are at the base of a chain of issues that face the whole world. Hunger, malnutrition and a high rate of death are all consequences of the main challenge, as 1.2 billion people suffer from hunger all over the world. The main challenge of the project is improving the irrigation process by providing clean water suitable for irrigation and modifying both the irrigation system and the plants to increase production. Purifying sewage water with a Down-flow Anaerobic Peat-moss Blanket (DAPB), using a modified drip irrigation system, and applying the hormone gibberellin to increase the rate of plant growth are believed to be useful solutions to the main challenge. The project produces a greater quantity of crops with higher efficiency and little cost, so it meets the design requirements of any project (production, efficiency and cost). The prototype of the project represents the water treatment process (anaerobic treatment) and tests the degree of purification of the water (the efficiency). The results showed that the treated water TDS was less than 1000 mg/l (ppm) when measured by a TDS meter, so it is suitable for irrigation. In conclusion, test results showed that this project is a strong solution to the challenge addressed.

**Category:** Chemistry

[365] **viXra:1705.0404 [pdf]**
*submitted on 2017-05-28 12:05:57*

**Authors:** Abien Fred Agarap

**Comments:** 23 Pages.

This is a proposal for mathematically determining the learning rate to be used in a deep supervised convolutional neural network (CNN), based on student fluency. The CNN model shall be tasked to imitate how students play the game “Packet Attack”, a form of gamification of information security awareness training, and learn in the same rate as the students did. The student fluency shall be represented by a mathematical function constructed using natural cubic spline interpolation, and its derivative shall serve as the learning rate for the CNN model. If proven right, the results will imply a more human-like rate of learning by machines.
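
The construction described can be sketched as follows: fit a natural cubic spline to (time, fluency) samples and use its derivative as the learning-rate schedule. The data points are invented for illustration, and the spline code is a textbook natural cubic spline, not the author's implementation.

```python
def natural_cubic_spline(xs, ys):
    """Return (value, derivative) callables for the natural cubic spline
    interpolating (xs[i], ys[i]); second derivative is zero at both ends."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    # Tridiagonal system (Thomas algorithm) for knot second derivatives
    # M[1..n-1]; M[0] = M[n] = 0 by the natural boundary condition.
    M = [0.0] * (n + 1)
    if n > 1:
        a = [h[r] for r in range(n - 1)]                     # sub-diagonal
        b = [2.0 * (h[r] + h[r + 1]) for r in range(n - 1)]  # diagonal
        c = [h[r + 1] for r in range(n - 1)]                 # super-diagonal
        d = [6.0 * ((ys[r + 2] - ys[r + 1]) / h[r + 1]
                    - (ys[r + 1] - ys[r]) / h[r]) for r in range(n - 1)]
        for r in range(1, n - 1):                            # forward sweep
            w = a[r] / b[r - 1]
            b[r] -= w * c[r - 1]
            d[r] -= w * d[r - 1]
        M[n - 1] = d[-1] / b[-1]                             # back substitution
        for i in range(n - 2, 0, -1):
            M[i] = (d[i - 1] - c[i - 1] * M[i + 1]) / b[i - 1]

    def _seg(x):  # index i of the interval [xs[i], xs[i+1]] containing x
        return max(0, min(n - 1, next((j for j in range(n) if x <= xs[j + 1]), n - 1)))

    def value(x):
        i = _seg(x); hi = h[i]
        return (M[i] * (xs[i + 1] - x) ** 3 / (6 * hi)
                + M[i + 1] * (x - xs[i]) ** 3 / (6 * hi)
                + (ys[i] / hi - M[i] * hi / 6) * (xs[i + 1] - x)
                + (ys[i + 1] / hi - M[i + 1] * hi / 6) * (x - xs[i]))

    def derivative(x):
        i = _seg(x); hi = h[i]
        return (-M[i] * (xs[i + 1] - x) ** 2 / (2 * hi)
                + M[i + 1] * (x - xs[i]) ** 2 / (2 * hi)
                - (ys[i] / hi - M[i] * hi / 6)
                + (ys[i + 1] / hi - M[i + 1] * hi / 6))

    return value, derivative

# Invented (time, fluency) samples; the derivative serves as the learning rate.
fluency, learning_rate = natural_cubic_spline([0.0, 1.0, 2.0, 3.0],
                                              [0.1, 0.4, 0.7, 0.8])
lr_at_half_time = learning_rate(1.5)
```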

**Category:** Artificial Intelligence

[364] **viXra:1705.0403 [pdf]**
*submitted on 2017-05-28 08:09:56*

**Authors:** Jean claude perez

**Comments:** 32 Pages.

DUF1220 protein regions show the largest Homo sapiens lineage-specific increase in copy number of any protein-coding region in the human genome and map principally to 1q21.1, and partially also to 1p. DUF1220 deletions and reciprocal duplications have been associated with microcephaly and macrocephaly, respectively. At the University of Colorado, Dr. Sikela's team established that human genome sequences encoding DUF1220 show a dramatically elevated copy number in the human lineage, and variation in DUF1220 copy number has been linked to both brain size in humans and brain evolution among primates. Remarkably, dosage variations involving DUF1220 sequences have now been linked to human brain expansion, autism severity, total IQ, and cognitive and mathematical aptitude scores.

We analyzed in chromosome 1q a large region of 218 contiguous DUF1220 domains, as well as five other smaller DUF1220 regions in chromosome 1p, for a total of 245 DUF1220 proteins. We supplemented this by analyzing 16 RNAs of NBPF genes containing these DUF1220 domains and also 3 representative NBPF genes from the Neanderthal genome. Finally, the method is extended by analysing the long 1q21 region from 7 other close primates: Neanderthal, the great apes (chimpanzee, gorilla, orangutan) and monkeys (macaque, marmoset, vervet). This remarkable property is confirmed by comparing these primates to other mammals such as mouse, rabbit, cow, dolphin and elephant.

We then show four classes of multi-periodic fractal structures for all 19 DUF1220 regions and 19 NBPF genes studied. The analysis of these spectra of fractal periods reveals a simple linear interdependence, hierarchization and unification between the numerical sequences of each of these 4 spectra and the sequences of Fibonacci and Lucas. Given the evidence of this numerical relationship, we suggest that this discovery may be one of the major causes of the cognitive development of man superior to that of the great primates.

**Category:** Mind Science

[363] **viXra:1705.0402 [pdf]**
*submitted on 2017-05-28 04:09:15*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 12 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Normalized Similarity Measure considered to Exhaustion [1].

**Category:** Statistics

[362] **viXra:1705.0401 [pdf]**
*submitted on 2017-05-28 06:18:10*

**Authors:** Grisha Filippov

**Comments:** 8 Pages.

In the framework of the MKS-PLUS system of units, a dependence is established between the vacuum permittivity, vacuum permeability, Newtonian gravitational constant and speed of light. This dependence allowed us to find numerical values of the permittivity and permeability of the gravitomagnetic vacuum.

**Category:** Classical Physics

[361] **viXra:1705.0399 [pdf]**
*submitted on 2017-05-28 00:50:44*

**Authors:** Andrzej Peczkowski

**Comments:** 15 Pages.

This is mathematics where the axes of the OX and OY coordinate systems do not intersect at right angles. Hi 1 is the OY axis that crosses the OX axis at any angle.

**Category:** Functions and Analysis

[360] **viXra:1705.0398 [pdf]**
*submitted on 2017-05-28 00:55:17*

**Authors:** Andrzej Peczkowski

**Comments:** 14 Pages.

This is mathematics where the axes of the OX and OY coordinate systems do not intersect at right angles. Hi 1 is the OX axis that crosses the OY axis at any angle.

**Category:** Functions and Analysis

[359] **viXra:1705.0397 [pdf]**
*submitted on 2017-05-28 01:04:24*

**Authors:** Andrzej Peczkowski

**Comments:** 17 Pages.

This is mathematics where the axes of the OX and OY coordinate systems do not intersect at right angles. Part 3. Axes OX and OY intersect at any angle

**Category:** Functions and Analysis

[358] **viXra:1705.0396 [pdf]**
*submitted on 2017-05-28 01:50:06*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 8 Pages.

**Category:** Statistics

[357] **viXra:1705.0395 [pdf]**
*submitted on 2017-05-28 03:10:36*

**Authors:** Oleg Cherepanov

**Comments:** 6 Pages. http://www.trinitas.ru/rus/doc/0016/001d/2254-chr.pdf

The discovered algorithm for extracting prime numbers from the natural series is an alternative to both the sieve of Eratosthenes and the sieves of Sundaram and Atkin. The distribution of prime numbers has no formula, but if a number one less than a prime is taken as the exponent of integers, then there are no two integers whose powers of that degree sum to the same power of a third integer. This is the statement of P. Fermat's Great Theorem, the proof of which he could have begun by using the Little Theorem known to him. The first part of the proof is restored here. But how did P. Fermat finish it?

**Category:** Number Theory

[356] **viXra:1705.0394 [pdf]**
*replaced on 2017-05-28 03:51:51*

**Authors:** LI WeiGang

**Comments:** 3 Pages.

In sodium sulfate (Na2SO4), the molar ratio of Na+ cations to SO4 anions is 2:1. Thus, when an aqueous solution of sodium sulfate between the upper and lower semipermeable membranes in the figure is placed in a vertically downward electrostatic field, the Na+ cations and SO4 anions concentrate toward the bottom and the top respectively, forming a Na+-cation-rich solution at the bottom and an SO4-anion-rich solution above. The charge is balanced between top and bottom, but the molar concentrations of ions are not!

**Category:** Thermodynamics and Energy

[355] **viXra:1705.0393 [pdf]**
*submitted on 2017-05-27 18:13:53*

**Authors:** Caitherine Gormaund

**Comments:** 2 Pages.

Herein we introduce the subject of the Gormaund numbers, and prove a fundamental property thereof.

**Category:** Number Theory

[354] **viXra:1705.0390 [pdf]**
*submitted on 2017-05-27 07:41:40*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I present a method to obtain larger primes from a given prime p1, namely by inserting a power of 3 before one of the digits of p1 and, once a prime p2 is obtained, repeating the operation on p2, and so on. By this method I obtained, from a prime with 9 digits, a prime with 36 digits (the steps are shown in this paper), using just the numbers 3, 9 (3^2), 27 (3^3) and 243 (3^5).
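One growth step of this method is easy to sketch in Python. This is a minimal illustration, not the author's code: `is_prime` is a deterministic Miller-Rabin test adequate for the sizes involved, and the order in which powers of 3 and digit positions are tried is an assumption based on the abstract.

```python
def is_prime(n):
    """Deterministic Miller-Rabin; these bases are valid for all n < 3.3e24."""
    if n < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def grow_prime(p, powers=(3, 9, 27, 243)):
    """Insert a power of 3 before some digit of prime p; return the first prime found."""
    s = str(p)
    for w in powers:
        for i in range(len(s)):
            q = int(s[:i] + str(w) + s[i:])
            if is_prime(q):
                return q
    return None
```

Starting from p1 = 13, for instance, `grow_prime(13)` inserts a 3 in front to give the prime 313, and applying the step again to 313 yields the prime 3313, lengthening the chain exactly as the paper describes.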

**Category:** Number Theory

[353] **viXra:1705.0389 [pdf]**
*replaced on 2017-11-08 19:53:17*

**Authors:** A. Lipovka

**Comments:** 18 Pages.

In the present paper we argue that there is no need to invoke the concept of dark matter to explain the shape of the Rotation Curves (RC) of galaxies. Rotation curves are completely determined by the distribution of baryonic matter and gas kinetics. Such parameters of a galaxy as its baryonic mass and its distribution can easily be calculated from the observed RC. We show the extended parts of RCs to be just wind tails, formed by the gas of the outer disks, on the assumption that it obeys the laws of gas kinetics. As examples, the Galaxy, NGC7331 and NGC3198 are considered. We calculate the total mass of the Galaxy and find it to be 23.7×10^10 M_sun. For NGC7331 and NGC3198 the calculated total masses are 37.6×10^10 M_sun and 7.7×10^10 M_sun, respectively. Consequences for cosmology are discussed.

**Category:** Astrophysics

[352] **viXra:1705.0388 [pdf]**
*submitted on 2017-05-26 07:04:00*

**Authors:** Sergio Conte, Elio Conte

**Comments:** 1 Page.

We define the basic foundations of a method for the frequency-domain analysis of time-series biosignals of physiological and psycho-physiological interest in medicine and biology.

**Category:** Mind Science

[351] **viXra:1705.0387 [pdf]**
*submitted on 2017-05-26 07:13:42*

**Authors:** George Rajna

**Comments:** 46 Pages.

Living cells must constantly process information to keep track of the changing world around them and arrive at an appropriate response. [26] A research team led by Professor YongKeun Park of the Physics Department at KAIST has developed an optical manipulation technique that can freely control the position, orientation, and shape of microscopic samples having complex shapes. [25] Rutgers researchers have developed a new way to analyze hundreds of thousands of cells at once, which could lead to faster and more accurate diagnoses of illnesses, including tuberculosis and cancers. [24] An international team including researchers from MIPT has shown that iodide phasing—a long-established technique in structural biology—is universally applicable to membrane protein structure determination. [23] Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22] A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21] Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20] Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible – they have developed a new fluorescence microscope, called MINFLUX, allowing, for the first time, to optically separate molecules, which are only nanometers (one millionth of a millimeter) apart from each other. [19] Dipole orientation provides new dimension in super-resolution microscopy [18]

**Category:** Biochemistry

[350] **viXra:1705.0386 [pdf]**
*submitted on 2017-05-26 07:54:04*

**Authors:** Robert D. Bock

**Comments:** 10 Pages.

When confronted with the challenge of defining distant simultaneity Einstein looked down two roads that seemingly diverged. One road led to a theory based on backward null cone simultaneity and the other road led to a theory based on standard simultaneity. He felt that alone he could not travel both. After careful consideration he looked down the former and then took the latter. Sadly, years hence, he did not return to the first. In the following we investigate Einstein's road not taken, i.e., the road that leads to a theory based on backward null cone simultaneity. We show that both roads must be traveled to develop a consistent quantum theory of gravity and also to understand the relationship between the gravitational and electromagnetic fields.

**Category:** Relativity and Cosmology

[349] **viXra:1705.0385 [pdf]**
*submitted on 2017-05-26 08:59:58*

**Authors:** George Rajna

**Comments:** 35 Pages.

Three teams working independently have found a nearly identical way to boost the resolution of quantum magnetic sensors, allowing frequency measurements with far higher precision than previous techniques. [22]
The 'quantized magneto-electric effect' has been demonstrated for the first time in topological insulators at TU Wien, which is set to open up new and highly accurate methods of measurement. [21]
In a recent experiment at EPFL, a microwave resonator, a circuit that supports electric signals oscillating at a resonance frequency, is coupled to the vibrations of a metallic micro-drum. [20]
Researchers at the Institute of Solid State Physics map out a radically new approach for designing optical and electronic properties of materials in Advanced Materials. [19]
Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18]
EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17]
Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16]
When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15]
Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]
Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13]
An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons - thought to be indivisible building blocks of nature - to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum-mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, Wave-Particle Duality and the electron’s spin, building the bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.
The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[348] **viXra:1705.0384 [pdf]**
*replaced on 2017-05-29 07:28:23*

**Authors:** Johan Noldus

**Comments:** 6 Pages.

I repeat a theory I launched in 2012 and dismissed afterwards, because I was not sure about its soundness and partially because I realized it was only an approximation to more complex situations. However, it is useful and explains several observations of mine from very simple characteristics.

**Category:** Quantum Physics

[347] **viXra:1705.0383 [pdf]**
*submitted on 2017-05-26 09:55:01*

**Authors:** Ricardo Gil

**Comments:** 3 Pages. When seconds count, Artificial Intelligence on faster hardware makes the difference. Like High Frequency Trading Algorithms on Wall Street.

AI is like a brain with no conscience; it rules itself unbridled. You shouldn’t try to set parameters on AI, because that will stifle innovation. What you have to do is treat it like High Frequency Trading on Wall Street: it’s all about speed, and the fastest algorithm wins. So deliberately do not allow the public to have faster chips than the Government. Give the Government the advantage with faster chips (>THz, with many cores), and give the public AI at GHz or less. See Retro-Causal Machine Learning below; its use should make sense now. Give it to GOOGLE or any company that aligns itself to look out for American interests. In short, one can look to Wall Street, where the fastest algorithm wins the High Frequency Trade: control chip speed for the masses (GHz or less) and run Government AI programs on the fastest chips to win against all other AI and prevent the AI Apocalypse.

**Category:** Relativity and Cosmology

[346] **viXra:1705.0382 [pdf]**
*submitted on 2017-05-26 04:30:17*

**Authors:** George Rajna

**Comments:** 45 Pages.

Living cells must constantly process information to keep track of the changing world around them and arrive at an appropriate response. [26] A research team led by Professor YongKeun Park of the Physics Department at KAIST has developed an optical manipulation technique that can freely control the position, orientation, and shape of microscopic samples having complex shapes. [25] Rutgers researchers have developed a new way to analyze hundreds of thousands of cells at once, which could lead to faster and more accurate diagnoses of illnesses, including tuberculosis and cancers. [24] An international team including researchers from MIPT has shown that iodide phasing—a long-established technique in structural biology—is universally applicable to membrane protein structure determination. [23] Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22] A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21] Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20] Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible – they have developed a new fluorescence microscope, called MINFLUX, allowing, for the first time, to optically separate molecules, which are only nanometers (one millionth of a millimeter) apart from each other. 
[19] Dipole orientation provides new dimension in super-resolution microscopy [18] Fluorescence is an incredibly useful tool for experimental biology and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17]

**Category:** Physics of Biology

[345] **viXra:1705.0381 [pdf]**
*submitted on 2017-05-26 05:47:14*

**Authors:** Adrian Ferent

**Comments:** 62 Pages. © 2015 Adrian Ferent

I explained why Newton’s third law is wrong!
In the last 300 years you learned from your professors that Newton’s third law is right.
“Newton and Einstein did not understand Gravitation, they calculated Gravitation”
Adrian Ferent
1. Why is Newton’s third law wrong?
Consider, for example, the interaction between a black hole and a planet or a star:
“The gravitational force exerted by the black hole on the planet is much higher than the gravitational force exerted by the planet on the black hole, because the energy of the gravitons emitted by the black hole is much higher than the energy of the gravitons emitted by the planet”
Adrian Ferent
This means Newton’s third law is wrong:
F12 ≠ −F21
2. Why is Newton’s third law wrong?
Because the reaction is not simultaneous: the gravitons which mediate the gravitational force have a finite speed, not an infinite speed.
“The majority of Dark matter is the core of the supermassive black holes”
Adrian Ferent
“Einstein bent the space, Ferent unbent the space”
Adrian Ferent

**Category:** Quantum Gravity and String Theory

[344] **viXra:1705.0380 [pdf]**
*submitted on 2017-05-26 06:12:51*

**Authors:** George Rajna

**Comments:** 43 Pages.

One promising approach to building them involves harnessing nanometer-scale atomic defects in diamond materials. [23] Based on early research involving the storage of movies and documents in DNA, Microsoft is developing an apparatus that uses biology to replace tape drives, researchers at the company say. [22] Our brains are often compared to computers, but in truth, the billions of cells in our bodies may be a better analogy. The squishy sacks of goop may seem a far cry from rigid chips and bundled wires, but cells are experts at taking inputs, running them through a complicated series of logic gates and producing the desired programmed output. [21] At Caltech, a group of researchers led by Assistant Professor of Bioengineering Lulu Qian is working to create circuits using not the usual silicon transistors but strands of DNA. [20] Researchers have introduced a new type of "super-resolution" microscopy and used it to discover the precise walking mechanism behind tiny structures made of DNA that could find biomedical and industrial applications. [19] Genes tell cells what to do—for example, when to repair DNA mistakes or when to die—and can be turned on or off like a light switch. Knowing which genes are switched on, or expressed, is important for the treatment and monitoring of disease. Now, for the first time, Caltech scientists have developed a simple way to visualize gene expression in cells deep inside the body using a common imaging technology. [18] Researchers at The University of Manchester have discovered that a potential new drug reduces the number of brain cells destroyed by stroke and then helps to repair the damage. [17]

**Category:** Quantum Physics

[343] **viXra:1705.0379 [pdf]**
*submitted on 2017-05-26 06:15:45*

**Authors:** Ricardo.gil

**Comments:** 1 Page. Email solutions or suggestions to Ricardo.gil@sbcglobal.net

In mathematics there are the seven unsolved Clay Mathematics problems. Another problem is the question of whether there is a God (or Gods). The purpose of my paper is to explain that in the end we all meet our maker, and that man does not have the power to cheat death. Like the Riemann Zeta function, which remains unsolved and, when solved, will give insight into the distribution of the primes, giving or solving this open-ended problem will help me solve a problem. This is the only problem I have not been able to solve, and I am open-sourcing it.

**Category:** Number Theory

[342] **viXra:1705.0378 [pdf]**
*submitted on 2017-05-25 14:07:06*

**Authors:** George Rajna

**Comments:** 36 Pages.

To study this quantum property, NIST physicist and fellow Joseph A. Stroscio and his colleagues studied electrons corralled in special orbits within a nanometer-sized region of graphene—an ultrastrong, single layer of tightly packed carbon atoms. [22] The 'quantized magneto-electric effect' has been demonstrated for the first time in topological insulators at TU Wien, which is set to open up new and highly accurate methods of measurement. [21] In a recent experiment at EPFL, a microwave resonator, a circuit that supports electric signals oscillating at a resonance frequency, is coupled to the vibrations of a metallic micro-drum. [20] Researchers at the Institute of Solid State Physics map out a radically new approach for designing optical and electronic properties of materials in Advanced Materials. [19] Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18] EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17] Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16] When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15] Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. 
In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]

**Category:** Quantum Physics

[341] **viXra:1705.0377 [pdf]**
*submitted on 2017-05-25 15:50:20*

**Authors:** Colin Walker

**Comments:** 9 Pages.

Simulating Bell correlations by Monte Carlo methods can be time-consuming due to the large number of trials required to produce reliable statistics. For a noisy vector model, formulating the vector threshold crossing in terms of geometric probability can eliminate the need for trials, with inferred probabilities replacing statistical frequencies.
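The cost-versus-accuracy trade-off of trial-based simulation is easy to see in a toy Monte Carlo estimate of the singlet correlation E(a, b) = -cos(a - b). This is a generic illustration of why closed-form probabilities are attractive, not the paper's noisy vector model:

```python
import math
import random

def mc_singlet_correlation(a, b, n_trials, rng):
    """Estimate E(a, b) by sampling outcome pairs with the quantum probabilities.
    P(outcomes agree) = sin^2((a - b)/2), so E = 2*P(agree) - 1 = -cos(a - b)."""
    p_agree = math.sin((a - b) / 2.0) ** 2
    total = sum(1 if rng.random() < p_agree else -1 for _ in range(n_trials))
    return total / n_trials

rng = random.Random(0)
exact = -math.cos(math.pi / 4)
estimate = mc_singlet_correlation(0.0, math.pi / 4, 100_000, rng)
# The statistical error shrinks only as 1/sqrt(n_trials), which is the
# motivation for replacing sampled frequencies with inferred probabilities.
```

Even at 100,000 trials the estimate still carries noise of order 0.002, while an exact geometric-probability expression would cost a single evaluation.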

**Category:** Quantum Physics

[340] **viXra:1705.0376 [pdf]**
*submitted on 2017-05-25 17:15:36*

**Authors:** Ramzi suleiman

**Comments:** 5 Pages.

It is believed that the Sagnac effect does not contradict Special Relativity theory because it is manifest in non-inertial rotational motion; therefore, it should be treated in the framework of General Relativity theory. However, several well-designed studies have convincingly shown that a Sagnac Effect identical to the one manifest in rotational uniform motion is also manifest in transverse uniform motion. This result should have been sufficient to falsify Special Relativity theory. In the present article, we offer theoretical support to the experimental results by elucidating the notion that the dynamics of transverse and rotational types of motion are completely equivalent. Since the transverse Sagnac effect contradicts Special Relativity theory, it follows that the rotational Sagnac effect contradicts Special Relativity theory as well. In addition, we show that our recently proposed Information Relativity theory, in which we abandoned the constancy of the velocity of light axiom, theoretically accounts for the Sagnac effect.

**Category:** Relativity and Cosmology

[339] **viXra:1705.0375 [pdf]**
*submitted on 2017-05-25 18:57:51*

**Authors:** Evgeny A. Novikov

**Comments:** 8 Pages.

Quantum modification of general relativity (Qmoger) is supported by cosmic data (without fitting). The Qmoger equations consist of the Einstein equations with two additional terms responsible for the production/absorption of matter. In Qmoger cosmology there was no Big Bang, and matter is continuously produced by the Vacuum. In particular, production of ultralight gravitons with a tiny electric dipole moment started about 284 billion years ago. Quantum effects dominate the interaction of these particles, and they form a quantum condensate. Under the influence of gravitation, the condensate forms galaxies and produces ordinary matter, including photons. As one important result of this activity, it recently created us, the people, and continues to support us. In particular, our subjective experiences are the result of an interaction between the background condensate and the neural system of the brain. The action potentials of the neural system create traps and coherent dynamic patterns in the dipolar condensate. So our subjective experiences are graviton-based, which can open new directions of research in biology and medicine.

**Category:** General Science and Philosophy

[338] **viXra:1705.0374 [pdf]**
*replaced on 2017-06-12 02:09:24*

**Authors:** Zihao Song

**Comments:** 6 Pages. If one physical quantity can't find where it is originated, it's not a good physical quantity.

Physicists propose different mechanics to describe nature. A physical body is measured by intrinsic properties, like electric charge, and by extrinsic properties related to space, like generalized coordinates or velocities; with these properties we can predict what event will happen. We can naturally define the fact of an event and the cause of an event as information; the information grasped by the physicist must originate from something objective, so information must have an object as its container. Intrinsic-property information is contained by the object itself, but the container of extrinsic-property information, like position, is ambiguous: position is a relation based on multiple objects, and it is hard to define which one is the information container. With such ambiguity, no mechanics is a complete theory, and errors hidden in assumptions are hard to find. Here we show a new theoretical framework with a strict information-container restriction, on which we can build complete determinism theories to approach grand unification.

**Category:** Mathematical Physics

[337] **viXra:1705.0372 [pdf]**
*submitted on 2017-05-25 23:12:56*

**Authors:** Temur Z. Kalanov

**Comments:** 13 Pages.

The work is devoted to the 21st century’s most urgent problem: the problem of existence of God. The theoretical proof of the existence and of the uniqueness of God, based on the correct method of knowledge – unity of formal logic and of rational dialectics, – is proposed. This proof represents a theoretical model of God: a system of axioms from which the principle of existence and of uniqueness of God is deduced. The principle runs as follows: God exists as the Absolute, the Creator, the Governor of the essence (information) and of the phenomenon (material manifestation of information). The theoretical model of man and the formulation of the principle of development of Mankind – as consequences of the model of God – are proposed as well. The main conclusion is as follows: the principle of the existence and of the uniqueness of God represents absolute scientific truth and, consequently, should be a starting-point and a basis of the 21st century’s correct sciences.

**Category:** General Science and Philosophy

[336] **viXra:1705.0371 [pdf]**
*submitted on 2017-05-25 07:26:26*

**Authors:** George Rajna

**Comments:** 31 Pages.

At first glance, biomedical imaging devices, cell phones, and radio telescopes may not seem to have much in common, but they are all examples of technologies that can benefit from certain types of relaxor ferroelectrics— ceramics that change their shape under the application of an electric field. [23] Researchers from the University of Illinois at Urbana-Champaign have demonstrated a new level of optical isolation necessary to advance on-chip optical signal processing. The technique involving light-sound interaction can be implemented in nearly any photonic foundry process and can significantly impact optical computing and communication systems. [22] City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21] Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20] Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. 
[17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]

**Category:** Condensed Matter

[335] **viXra:1705.0370 [pdf]**
*submitted on 2017-05-25 08:47:15*

**Authors:** Ding-Yu Chung

**Comments:** 18 Pages. Published in Journal of Modern Physics, 2016, 7, 1210-1227

This paper posits that we are living in a computer simulation to simulate physical reality which has the same computer simulation process as virtual reality (computer-simulated reality). The computer simulation process involves the digital representation of data, the mathematical computation of the digitized data in geometric formation and transformation in space-time, and the selective retention of events in a narrative. Conventional physics cannot explain physical reality clearly, while computer-simulated physics can explain physical reality clearly by using the computer simulation process consisting of the digital representation component, the mathematical computation component, and the selective retention component. For the digital representation component, the three intrinsic data (properties) are rest mass-kinetic energy, electric charge, and spin which are represented by the digital space structure, the digital spin, and the digital electric charge, respectively. The digital representations of rest mass and kinetic energy are 1 as attachment space for the space of matter and 0 as detachment space for the zero-space of matter, respectively, to explain the Higgs field, the reverse Higgs field, quantum mechanics, special relativity, force fields, dark matter, and baryonic matter. The digital representations of the exclusive and the inclusive occupations of positions are ½ spin fermion and integer spin boson, respectively, to explain spatial translation by supersymmetry transformation and dark energy. The digital representations of the allowance and the disallowance of irreversible kinetic energy are integral electric charges and fractional electric charges, respectively, to explain the confinements of quarks and quasiparticles. 
For the mathematical computation component, the mathematical computation involves the reversible multiverse and oscillating M-theory as oscillating membrane-string-particle whose space-time dimension (D) number oscillates between 11D and 10D and between 10D and 4D to explain cosmology. For the selective retention component, gravity, the strong force, electromagnetism, and the weak force are the retained events during the reversible four-stage evolution of our universe, and are unified by the common narrative of the evolution.

**Category:** Quantum Gravity and String Theory

[334] **viXra:1705.0369 [pdf]**
*submitted on 2017-05-25 08:55:15*

**Authors:** Ding-Yu Chung

**Comments:** 16 Pages. Published in Journal of Modern Physics, 2016, 7, 1591-1606

One of the biggest unsolved problems in physics is the particle masses of all elementary particles which cannot be calculated accurately and predicted theoretically. In this paper, the unsolved problem of the particle masses is solved by the accurate mass formulas which calculate accurately and predict theoretically the particle masses of all leptons, quarks, gauge bosons, the Higgs boson, and cosmic rays (the knees-ankles-toe) by using only five known constants: the number (seven) of the extra spatial dimensions in the eleven-dimensional membrane, the mass of electron, the masses of Z and W bosons, and the fine structure constant. The calculated masses are in excellent agreements with the observed masses. For examples, the calculated masses of muon, top quark, pion, neutron, and the Higgs boson are 105.55 MeV, 175.4 GeV, 139.54 MeV, 939.43 MeV, and 126 GeV, respectively, in excellent agreements with the observed 105.65 MeV, 173.3 GeV, 139.57 MeV, 939.27 MeV, and 126 GeV, respectively. The theoretical base of the accurate mass formulas is the periodic table of elementary particles. As the periodic table of elements is derived from atomic orbitals, the periodic table of elementary particles is derived from the seven principal mass dimensional orbitals and seven auxiliary mass dimensional orbitals. All elementary particles including leptons, quarks, gauge bosons, the Higgs boson, and cosmic rays can be placed in the periodic table of elementary particles. The periodic table of elementary particles is based on the theory of everything as the computer simulation model of physical reality consisting of the mathematical computation, digital representation, and selective retention components. The computer simulation model of physical reality provides the seven principal mass dimensional orbitals and seven auxiliary mass dimensional orbitals for the periodic table of elementary particles.
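The level of agreement quoted in the abstract can be checked directly from the numbers given there; a small sketch, with the (calculated, observed) values copied from the text:

```python
# (calculated, observed) masses quoted in the abstract, in MeV
masses = {
    "muon":    (105.55,   105.65),
    "top":     (175400.0, 173300.0),
    "pion":    (139.54,   139.57),
    "neutron": (939.43,   939.27),
    "Higgs":   (126000.0, 126000.0),
}
# Relative deviation of each calculated mass from the observed value.
rel_error = {name: abs(calc - obs) / obs for name, (calc, obs) in masses.items()}
```

On these quoted figures, the muon, pion and neutron agree to better than 0.1%, and the top quark to about 1.2%.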

**Category:** High Energy Particle Physics

[333] **viXra:1705.0368 [pdf]**
*submitted on 2017-05-25 08:58:46*

**Authors:** Ding-Yu Chung

**Comments:** 13 Pages.

The unified theory of physics is based on both symmetry physics and yinyang physics to unify all physical laws and phenomena, all four fundamental forces, and all elementary particles. Conventional symmetry physics preserves the physical features of a system under transformation by a symmetry operator. In unconventional yinyang physics, yin and yang constitute a binary yinyang system of opposite physical properties produced by yin and yang operators. The three fundamental symmetry operators transform the three fundamental yinyang systems (inclusiveness-exclusiveness, rest-movement, and composite-individual) into the unified theory of physics. In the inclusiveness-exclusiveness system, a particle is transformed into a boson with inclusive occupation of position by the integer-spin operator, while a particle is transformed into a fermion with exclusive occupation of position by the ½-spin operator. The fundamental symmetry operator is supersymmetry, which results in M-theory and cosmology. In the rest-movement system, a moving massless particle (kinetic energy) is transformed into a resting massive particle (rest mass) by the attachment-space (denoted as 1) operator to explain the Higgs field, while a resting massive particle is transformed into a moving massless particle by the detachment-space (denoted as 0) operator to explain the reverse Higgs field. The fundamental symmetry operator is the symmetrical combination of attachment space and detachment space, which brings about the three space structures: binary partition space, (1)n(0)n, for wave-particle duality; binary miscible space, (1+0)n, for relativity; and binary lattice space, (1 0)n, for virtual particles in quantum field theory. In the composite-individual system, particles are transformed into fractional-charge quark composites by the fractional electric charge operator, while particles are transformed into integral-charge particle individuals by the integral electric charge operator. The fundamental symmetry operator is the symmetrical combination of quarks, leptons, and bosons to constitute the periodic table of elementary particles, which accurately calculates the particle masses of all elementary particles.

**Category:** Quantum Gravity and String Theory

[332] **viXra:1705.0367 [pdf]**
*submitted on 2017-05-25 10:02:16*

**Authors:** George Rajna

**Comments:** 44 Pages.

A research team led by Professor YongKeun Park of the Physics Department at KAIST has developed an optical manipulation technique that can freely control the position, orientation, and shape of microscopic samples having complex shapes. [25] Rutgers researchers have developed a new way to analyze hundreds of thousands of cells at once, which could lead to faster and more accurate diagnoses of illnesses, including tuberculosis and cancers. [24] An international team including researchers from MIPT has shown that iodide phasing—a long-established technique in structural biology—is universally applicable to membrane protein structure determination. [23] Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22] A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21] Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20] Scientists working with the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was long considered impossible: they have developed a new fluorescence microscope, called MINFLUX, that for the first time makes it possible to optically separate molecules that are only nanometers (one millionth of a millimeter) apart from each other. [19] Dipole orientation provides a new dimension in super-resolution microscopy. [18] Fluorescence is an incredibly useful tool for experimental biology, and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17]

**Category:** Physics of Biology

[331] **viXra:1705.0366 [pdf]**
*replaced on 2017-06-03 06:19:55*

**Authors:** Sergey Shevchenko, Vladimir Tokarevsky

**Comments:** Two versions: Engl. pages 1-9 and Russian pages 10-20

In this paper a few problems relating to the special relativity theory (SRT) are considered: real SRT problems that arise from the self-inconsistency of the theory and so limit its correct application, and imaginary ones, in which some "refutation of SRT" erroneously applies the notion of "relative speed", first of all the "(c±V)" problem. The applicability of the Tangherlini transformations is also briefly considered.

**Category:** Relativity and Cosmology

[330] **viXra:1705.0365 [pdf]**
*submitted on 2017-05-25 10:44:51*

**Authors:** George Rajna

**Comments:** 32 Pages.

In the world of electronics, where the quest is always for smaller and faster units with infinite battery life, topological insulators (TI) have tantalizing potential. [24] At first glance, biomedical imaging devices, cell phones, and radio telescopes may not seem to have much in common, but they are all examples of technologies that can benefit from certain types of relaxor ferroelectrics— ceramics that change their shape under the application of an electric field. [23] Researchers from the University of Illinois at Urbana-Champaign have demonstrated a new level of optical isolation necessary to advance on-chip optical signal processing. The technique involving light-sound interaction can be implemented in nearly any photonic foundry process and can significantly impact optical computing and communication systems. [22] City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21] Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20] Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. 
[18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]

**Category:** Condensed Matter

[329] **viXra:1705.0364 [pdf]**
*replaced on 2017-10-14 16:41:47*

**Authors:** Chuanli Chen

**Comments:** 22 Pages.

In modern economics, there are many theories that discuss equilibrium. This convention originated with two famous economists, Leon Walras and Alfred Marshall. Walras first described general equilibrium theory in 1874. Alfred Marshall put forward the partial equilibrium theory in 1920. However, there has never been any observational evidence for the existence of equilibrium.

In this paper, I put forward a new price theory, named the Price Uncertainty Principle. I point out the flaws of these two equilibrium theories, discuss why the price mechanism is not the invisible hand, and then further discuss why partial equilibrium and general equilibrium are non-existent. I prove that there is no price equilibrium point and that market prices are always fluctuating.

**Category:** Economics and Finance

[328] **viXra:1705.0363 [pdf]**
*submitted on 2017-05-25 11:19:05*

**Authors:** George Rajna

**Comments:** 24 Pages.

A team at Harvard University has found a way to create a cold-atom Fermi–Hubbard antiferromagnet, which offers new insight into how electrons behave in solids. [33] NIST scientists have devised a novel hybrid system for cooling superconducting nanowire single-photon detectors (SNSPDs) – essential tools for many kinds of cutting-edge research – that is far smaller than those previously demonstrated and that eliminates the need for conventional cryogens, such as liquid helium. [32] The research team recently succeeded for the first time in precisely controlling the transition temperature of superconducting atomic layers using organic molecules. [31] For the first time, physicists have experimentally validated a 1959 conjecture that places limits on how small superconductors can be. [30] A new finding by physicists at MIT and in Israel shows that under certain specialized conditions, electrons can speed through a narrow opening in a piece of metal more easily than traditional theory says is possible. [29] Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor, meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive, which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. [27] Superconductivity is a rare physical state in which matter is able to conduct electricity—maintain a flow of electrons—without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov homes in on the structural changes underlying superconductivity in iron arsenide compounds—those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, wave-particle duality, and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass, and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators as well, we can explain the electron/proton mass ratio and the Weak and Strong Interactions.

**Category:** Condensed Matter

[327] **viXra:1705.0362 [pdf]**
*submitted on 2017-05-25 03:53:34*

**Authors:** George Rajna

**Comments:** 34 Pages.

We should remain optimistic that quantum computing and AI will continue to improve our lives, but we also should continue to hold companies, organizations, and governments accountable for how our private data is used, as well as the technology's impact on the environment. [20] It's man vs machine this week as Google's artificial intelligence programme AlphaGo faces the world's top-ranked Go player in a contest expected to end in another victory for rapid advances in AI. [19] Google's computer programs are gaining a better understanding of the world, and now it wants them to handle more of the decision-making for the billions of people who use its services. [18] Microsoft on Wednesday unveiled new tools intended to democratize artificial intelligence by enabling machine smarts to be built into software from smartphone games to factory floors. [17] The closer we can get a machine translation to be on par with expert human translation, the happier lots of people struggling with translations will be. [16] Researchers have created a large, open source database to support the development of robot activities based on natural language input. [15] A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14] A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence and have come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13] Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12]

**Category:** Artificial Intelligence

[326] **viXra:1705.0361 [pdf]**
*submitted on 2017-05-25 04:37:17*

**Authors:** George Rajna

**Comments:** 23 Pages.

NIST scientists have devised a novel hybrid system for cooling superconducting nanowire single-photon detectors (SNSPDs) – essential tools for many kinds of cutting-edge research – that is far smaller than those previously demonstrated and that eliminates the need for conventional cryogens, such as liquid helium. [32] The research team recently succeeded for the first time in precisely controlling the transition temperature of superconducting atomic layers using organic molecules. [31] For the first time, physicists have experimentally validated a 1959 conjecture that places limits on how small superconductors can be. [30] A new finding by physicists at MIT and in Israel shows that under certain specialized conditions, electrons can speed through a narrow opening in a piece of metal more easily than traditional theory says is possible. [29] Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor, meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive, which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. [27] Superconductivity is a rare physical state in which matter is able to conduct electricity—maintain a flow of electrons—without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov homes in on the structural changes underlying superconductivity in iron arsenide compounds—those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, wave-particle duality, and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass, and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators as well, we can explain the electron/proton mass ratio and the Weak and Strong Interactions.

**Category:** Condensed Matter

[325] **viXra:1705.0360 [pdf]**
*replaced on 2017-05-25 07:46:44*

**Authors:** Maik Becker-Sievert

**Comments:** 1 Page.

This identity directly proves Fermat's Last Theorem.

**Category:** Number Theory

[324] **viXra:1705.0359 [pdf]**
*submitted on 2017-05-25 07:12:28*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page.

It is observed that stars evolve into what are called "planets/exoplanets," meaning that planets/exoplanets are simply evolved or evolving stars. The theoretical foundation for explaining how this occurs is outlined in the General Theory of Stellar Metamorphosis. As a principle of this still-developing theory, the plasma-to-rock-and-metal principle is mentioned along with how it relates to metallurgy.

**Category:** Astrophysics

[323] **viXra:1705.0358 [pdf]**
*submitted on 2017-05-24 13:09:34*

**Authors:** Paul B. Slater

**Comments:** 35 pages, 26 figures

We investigate relationships between two forms of Hilbert-Schmidt two-re[al]bit and two-qubit "separability functions"--those recently advanced by Lovas and Andai (arXiv:1610.01410), and those earlier presented by Slater ({\it J. Phys. A} {\bf{40}} [2007] 14279). In the Lovas-Andai framework, the independent variable $\varepsilon \in [0,1]$ is the ratio $\sigma(V)$ of the singular values of the $2 \times 2$ matrix $V=D_2^{1/2} D_1^{-1/2}$ formed from the two $2 \times 2$ diagonal blocks ($D_1, D_2$) of a randomly generated $4 \times 4$ density matrix $D$. In the Slater setting, the independent variable $\mu$ is the diagonal-entry ratio $\sqrt{\frac{d_{11} d_{44}}{d_{22} d_{33}}}$--with, importantly, $\mu=\varepsilon$ or $\mu=\frac{1}{\varepsilon}$ when both $D_1$ and $D_2$ are themselves diagonal. Lovas and Andai established that their two-rebit function $\tilde{\chi}_1 (\varepsilon )$ ($\approx \varepsilon$) yields the previously conjectured Hilbert-Schmidt separability probability of $\frac{29}{64}$. We are able, in the Slater framework (using cylindrical algebraic decompositions [CAD] to enforce positivity constraints), to reproduce this result. Further, we similarly obtain its new (much simpler) two-qubit counterpart, $\tilde{\chi}_2(\varepsilon) =\frac{1}{3} \varepsilon ^2 \left(4-\varepsilon ^2\right)$. Verification of the companion conjecture of a Hilbert-Schmidt separability probability of $\frac{8}{33}$ immediately follows in the Lovas-Andai framework. We obtain the formulas for $\tilde{\chi}_1(\varepsilon)$ and $\tilde{\chi}_2(\varepsilon)$ by taking $D_1$ and $D_2$ to be diagonal, allowing us to proceed in lower (7 and 11), rather than the full (9 and 15) dimensions occupied by the convex sets of two-rebit and two-qubit states. The CAD's themselves involve 4 and 8 variables, in addition to $\mu=\varepsilon$. We also investigate extensions of these analyses to rebit-retrit and qubit-qutrit ($6 \times 6$) settings.
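
The two-qubit function quoted in the abstract is simple enough to sanity-check numerically. The sketch below (function name is illustrative) evaluates $\tilde{\chi}_2(\varepsilon) = \frac{1}{3}\varepsilon^2(4-\varepsilon^2)$ at the endpoints of $\varepsilon \in [0,1]$, where one would expect the values 0 and 1 for a well-normalized separability function:

```python
def chi2_tilde(eps):
    """Two-qubit separability function quoted in the abstract:
    (1/3) * eps^2 * (4 - eps^2)."""
    return (eps ** 2) * (4 - eps ** 2) / 3

# Endpoint checks: vanishes at eps = 0, reaches 1 at eps = 1.
print(chi2_tilde(0.0), chi2_tilde(1.0))  # → 0.0 1.0
```

The function is also monotonically increasing on $[0,1]$, consistent with its role as a separability function of the singular-value ratio.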

**Category:** Mathematical Physics

[322] **viXra:1705.0357 [pdf]**
*submitted on 2017-05-24 16:04:15*

**Authors:** Victor Christianto

**Comments:** 5 Pages. this paper has been submitted to MDPI - Mathematics

The problem of the formal connection between electrodynamics and wave mechanics has attracted the attention of a number of authors; in particular, there are several existing proofs of the Maxwell-Dirac isomorphism. Here the author reviews two derivations of the Maxwell-Dirac isomorphism, i.e., those by Hans Sallhofer and Volodimir Simulik. A few plausible extensions are discussed as well.

**Category:** Mathematical Physics

[321] **viXra:1705.0356 [pdf]**
*submitted on 2017-05-25 02:23:57*

**Authors:** Nikitin V.N., Nikitin I.V.

**Comments:** 1 Page.

The energy source of galaxies is the universal White Hole. Galaxies arose as a result of the galactic decay of the White Hole of the Universe.

**Category:** Astrophysics

[320] **viXra:1705.0355 [pdf]**
*replaced on 2017-05-31 13:02:10*

**Authors:** Angel Garcés Doz

**Comments:** 8 Pages. Added acknowledgments and references

In an article recently published on viXra (http://vixra.org/abs/1704.0365), its author (Mario Hieb) conjectured a possible relationship between Feigenbaum's delta constant and the fine-structure constant of electromagnetism (Sommerfeld's fine-structure constant). In this article it is demonstrated that there is indeed an unequivocal physical-mathematical relationship. The logistic map of double bifurcation is a physical image of the random process of creation-annihilation of virtual lepton-antilepton pairs with electric charge, using virtual photons. The probability of emission or absorption of a photon by an electron at zero momentum is precisely the fine-structure constant, that is to say, Sommerfeld's fine-structure constant. This probability is coded as the surface of a sphere or, equivalently, four times the surface of a circle. Mario Hieb's original conjectured calculation is corrected and improved by the contribution of the entropies of the virtual pairs of electrically charged leptons (muon, tau, and electron), including a correction factor due to the contributions of the virtual W and Z bosons and their decay into electrically charged leptons and quarks.

**Category:** Quantum Physics

[319] **viXra:1705.0354 [pdf]**
*submitted on 2017-05-25 03:18:41*

**Authors:** George Rajna

**Comments:** 34 Pages.

The 'quantized magneto-electric effect' has been demonstrated for the first time in topological insulators at TU Wien, which is set to open up new and highly accurate methods of measurement. [21] In a recent experiment at EPFL, a microwave resonator, a circuit that supports electric signals oscillating at a resonance frequency, is coupled to the vibrations of a metallic micro-drum. [20] Researchers at the Institute of Solid State Physics map out a radically new approach for designing optical and electronic properties of materials in Advanced Materials. [19] Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18] EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17] Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16] When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15] Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14] Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. 
The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13]

**Category:** Condensed Matter

[318] **viXra:1705.0353 [pdf]**
*submitted on 2017-05-24 07:25:20*

**Authors:** Ricardo Gobato, Manuel Simões Filho

**Comments:** 8 Pages. Ciência e Natura. v.39, n2. (459-466). http://dx.doi.org/10.5902/2179460X25617. https://periodicos.ufsm.br/cienciaenatura/article/view/25617

Spectroscopy is a technique for collecting physicochemical data through the transmission, absorption, or reflection of incident radiant energy in a sample. Our work uses common, low-cost, easily accessible devices that have a CCD reader. Our idea is to use a set of devices, such as a cell phone, that contain an optical CCD reader to analyze materials and compounds, simplifying the analysis of the images obtained by these optical devices. The footage obtained by the optical CCD readers of this hardware is decoded and separated into its quantized RGB color channels. Our initial technique consists of the analysis of the pixels of images of primary light sources, such as the sun, incandescent lamps, fire, candle flames, matchstick flames, wood combustion, etc. We conclude that it is possible to perform a spectroscopic analysis using our technique.
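
The RGB-channel decomposition step described in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: it uses a synthetic NumPy array as a stand-in for a decoded CCD frame (all function and variable names are illustrative) and separates it into its quantized R, G, and B channels with per-channel intensity histograms:

```python
import numpy as np

def split_rgb_channels(frame):
    """Split an H x W x 3 RGB frame into its three quantized channels."""
    return frame[..., 0], frame[..., 1], frame[..., 2]

def channel_histograms(frame, bins=256):
    """Per-channel intensity histograms, the raw data for a pixel analysis."""
    return [np.histogram(ch, bins=bins, range=(0, 256))[0]
            for ch in split_rgb_channels(frame)]

# Synthetic stand-in for one decoded CCD frame of a light source (0-255).
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)

r, g, b = split_rgb_channels(frame)
hists = channel_histograms(frame)
print(r.shape, len(hists), int(hists[0].sum()))  # → (120, 160) 3 19200
```

In a real pipeline, `frame` would come from decoding a phone-camera still or video frame; the three histograms can then be compared across different light sources.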

**Category:** Condensed Matter

[317] **viXra:1705.0352 [pdf]**
*submitted on 2017-05-24 05:41:58*

**Authors:** Korniienko V

**Comments:** 15 Pages. In Russian

The article shows that the property of power-plant generators to produce S-radiation causes the formation of underground tunnels along which hydrocarbons migrate beneath the power units, where explosives are synthesized from them. Deep explosions of these explosives cause severe accidents at power units, about whose causes experts have no consensus. The schemes provided confirm this hypothesis, and the growth in the volume of hydrocarbon migration along the tunnels in recent years increases the probability of explosions under power units and of severe accidents, up to the formation of huge holes in the earth at the site of reactors. Measures capable of preventing such explosions are proposed.

**Category:** Geophysics

[316] **viXra:1705.0351 [pdf]**
*submitted on 2017-05-23 13:06:29*

**Authors:** George Rajna

**Comments:** 22 Pages.

In 2013, a group of physicists from Austria proposed the existence of a new and unusual force called the "blackbody force." [18] Researchers have shown how singularities – which are normally only found at the centre of black holes and hidden from view – could exist in highly curved three-dimensional space. [17] A team of scientists at the Tata Institute of Fundamental Research (TIFR), Mumbai, India, have found new ways to detect a bare or naked singularity, the most extreme object in the universe. [16] New data from NASA's Chandra X-ray Observatory and other telescopes has revealed details about this giant black hole, located some 145 million light years from Earth. [15] A team of researchers from around the world is getting ready to create what might be the first image of a black hole. [14] "There seems to be a mysterious link between the amount of dark matter a galaxy holds and the size of its central black hole, even though the two operate on vastly different scales," said Akos Bogdan of the Harvard-Smithsonian Center for Astrophysics (CfA). [13] If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12] For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in, well, a big crunch (for lack of a better term). Think "the Big Bang," except just the opposite. That's essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black. [11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracts matter, causing concentration of matter in a small space and leaving much space with low matter concentration: dark matter and energy. There is an asymmetry between the masses of the electric charges, for example the proton and electron, which can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[315] **viXra:1705.0350 [pdf]**
*submitted on 2017-05-23 15:57:08*

**Authors:** Osvaldo F. Schilling

**Comments:** 9 Pages. This is an interdisciplinary paper. Since evidence for flux quantization as a function of magnetic moment is the main result this can be compared to quantized voltage steps as a function of current in Josephson Junctions.

Aharonov and Bohm predicted (and Chambers measured) interference patterns related to phase differences in the wavefunctions of two coherent electron beams traveling around a concentrated magnetic field source. The phase difference is proportional to the magnetic flux linked between the beams and should be an integer number n of flux quanta hc/e in the case that the wave functions are single-valued around a closed path of integration. This latter condition would occur in the case of a closed ring of moving charge instead of two independent beams, a situation that should be inaccessible experimentally under ordinary conditions. If such an experiment could actually be undertaken, magnetic flux should form quantized, Shapiro-like steps in a plot of confined flux against some variable. The objective of this paper is to display evidence for such flux quantization from the examination of rest-mass and magnetic-moment data for baryon-octet particles, which would play the role of rings of current. Our main result is a Shapiro-like step plot of flux against the magnetic moments of baryons.

**Category:** Condensed Matter

[314] **viXra:1705.0349 [pdf]**
*submitted on 2017-05-23 19:58:32*

**Authors:** W.Q. Sumner

**Comments:** 2 Pages.

In 1907 Einstein discovered the key to understanding accelerating Hubble redshifts. By assuming that acceleration and gravity are equivalent ("The Happiest Thought of my Life"), he proved that Maxwell's equations are the same in every accelerated reference frame but that vacuum permittivity depends on the acceleration. Vacuum permittivity is the scalar in Maxwell's equations that determines the speed of light and the strength of electrical fields. Maxwell's equations are valid in every coordinate system in general relativity. Vacuum permittivity depends on the spacetime curvature. For Friedmann spacetime, vacuum permittivity is proportional to the radius of the universe. When the radius changes, changing electrical fields in atoms change the wavelengths of emitted photons by about twice as much as photon wavelengths change. This is the key Einstein left us: The evolution of both photons and atoms must be used together to understand Hubble redshift. When this is done, the physics of Maxwell, Einstein, Bohr, and Friedmann fits modern Hubble redshift observations beautifully.

**Category:** Astrophysics

[313] **viXra:1705.0348 [pdf]**
*submitted on 2017-05-24 00:04:07*

**Authors:** Temur Z. Kalanov

**Comments:** 26 Pages.

A critical analysis of the problem of creating Artificial Intelligence (AI) and Artificial General Intelligence (AGI) is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. The main results of the analysis are as follows: (1) the model of man represents the unity of two material aspects: the “physiological body” (controllable aspect) and the “psychical body” (controlling aspect); (2) the “psychical body” is the subsystem “subconsciousness + consciousness”; (3) in the comprehensive sense of the word, thinking is an attribute of the complete system “physiological body + psychical body + environment”; (4) in the broad sense of the word, thinking and creativity are an essential feature of the subsystem “subconsciousness + consciousness”; (5) in the narrow (concise) sense of the word, thinking and creativity are attributes of the instinct of the conservation (preservation, retention, maintenance) of life (i.e., the self-preservation instinct, the survival instinct); the instinct of the conservation of life exists in the subconsciousness; (6) the instinct of the conservation of life is a system of elementary (basic) instincts; thinking is an attribute of each elementary instinct; (7) the mechanism of thinking and the essence of creation cannot be cognized by man; (8) a computer, as a device, cannot think and create (in particular, it cannot prove theorems) because a computer does not have a subconsciousness; (9) the modeling of human thinking and of Human Intellect, and the creation of AI and AGI, are impossible because the essential properties of the complete system “man + environment” cannot be cognized and modeled; (10) the existence of AI and AGI conflicts with the essence of thinking; (11) the existence of AI and AGI contradicts formal-logical and rational-dialectical laws.

**Category:** General Science and Philosophy

[312] **viXra:1705.0347 [pdf]**
*replaced on 2017-06-19 04:02:19*

**Authors:** Preobrazhenskiy Andrey

**Comments:** 13 Pages.

ABSTRACT. This paper deals with the analysis of physically possible constructions of a model of a viscous incompressible fluid. Physical principles that allow one to create the only possible construction of this model were found. The new model does not use new constants characterizing properties of the fluid, and coincides with the Stokes model only in the plane case. Within the framework of this model, new equations of fluid motion were obtained. The new equations coincide with the Navier-Stokes system in the plane case, but do not coincide in the three-dimensional one. The model makes it possible to see why the three-dimensional Navier-Stokes equations cannot physically adequately describe fluid motion, and indirectly supports the finite time of existence of their regular solutions.

**Category:** Mathematical Physics

[311] **viXra:1705.0346 [pdf]**
*submitted on 2017-05-23 12:05:10*

**Authors:** Sudipto Roy

**Comments:** 7 pages, 2 figures

The time dependence of the equation of state (EoS) parameter of the cosmic fluid, for a space of zero curvature, has been determined in the framework of the Brans-Dicke (BD) theory of gravity, using the FRW metric. For this purpose, empirical expressions for the scale factor, the scalar field and the dimensionless BD parameter have been used. The constant parameters involved in these expressions have been determined from the field equations. The dependence of the scalar field upon the scale factor, and the dependence of the BD parameter upon the scalar field, have been explored to determine the time dependence of the EoS parameter. Its rate of change with time has been found to depend upon a parameter that governs the time-dependent behavior of the scalar field. The time dependence of the EoS parameter has been graphically depicted.

**Category:** Relativity and Cosmology

[310] **viXra:1705.0345 [pdf]**
*submitted on 2017-05-23 07:52:50*

**Authors:** Fenton John Doolan

**Comments:** 16 pages

Since Isaac Newton first described gravity as a force of attraction between masses in the late seventeenth century, mankind has been trying to explain the mechanism that creates it. In 1915, Albert Einstein proposed in his mathematical theory of General Relativity that matter tells space and time how to bend. Since then, scientists have suggested the existence of the graviton, a particle that creates the force of attraction between two objects. This paper suggests that gravity is a by-product of electromagnetism: the Sun and the Earth act like inverter magnets, creating an attractive and a repulsive force.

**Category:** Quantum Physics

[309] **viXra:1705.0344 [pdf]**
*submitted on 2017-05-23 06:48:51*

**Authors:** George Rajna

**Comments:** 41 Pages.

Based on early research involving the storage of movies and documents in DNA, Microsoft is developing an apparatus that uses biology to replace tape drives, researchers at the company say. [22] Our brains are often compared to computers, but in truth, the billions of cells in our bodies may be a better analogy. The squishy sacks of goop may seem a far cry from rigid chips and bundled wires, but cells are experts at taking inputs, running them through a complicated series of logic gates and producing the desired programmed output. [21] At Caltech, a group of researchers led by Assistant Professor of Bioengineering Lulu Qian is working to create circuits using not the usual silicon transistors but strands of DNA. [20] Researchers have introduced a new type of "super-resolution" microscopy and used it to discover the precise walking mechanism behind tiny structures made of DNA that could find biomedical and industrial applications. [19] Genes tell cells what to do—for example, when to repair DNA mistakes or when to die—and can be turned on or off like a light switch. Knowing which genes are switched on, or expressed, is important for the treatment and monitoring of disease. Now, for the first time, Caltech scientists have developed a simple way to visualize gene expression in cells deep inside the body using a common imaging technology. [18] Researchers at The University of Manchester have discovered that a potential new drug reduces the number of brain cells destroyed by stroke and then helps to repair the damage. [17]

**Category:** Physics of Biology

[308] **viXra:1705.0343 [pdf]**
*submitted on 2017-05-22 13:32:15*

**Authors:** Edgar Valdebenito

**Comments:** 4 Pages.

This note presents some formulas for the constant pi.

**Category:** Number Theory

[307] **viXra:1705.0342 [pdf]**
*submitted on 2017-05-22 13:36:50*

**Authors:** Edgar Valdebenito

**Comments:** 16 Pages.

This note presents some formulas related to the number z = LambertW(i), where LambertW(x) is the Lambert W function.
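The value in question can be computed to machine precision with a few Newton steps on the defining equation w·e^w = i. The sketch below is illustrative only; the function name and the starting guess are my own choices, not taken from the note:

```python
import cmath

def lambertw_principal(z, tol=1e-12, max_iter=100):
    """Principal branch of the Lambert W function: solve w * exp(w) = z
    by Newton's method in the complex plane."""
    w = cmath.log(1 + z)  # rough starting guess near the principal branch
    for _ in range(max_iter):
        ew = cmath.exp(w)
        f = w * ew - z
        # Newton step: f'(w) = e^w * (1 + w)
        step = f / (ew * (1 + w))
        w -= step
        if abs(step) < tol:
            break
    return w

w = lambertw_principal(1j)
print(w)                   # ≈ 0.3747 + 0.5764j
print(w * cmath.exp(w))    # sanity check: ≈ i
```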

**Category:** Number Theory

[306] **viXra:1705.0341 [pdf]**
*submitted on 2017-05-22 14:39:03*

**Authors:** Arturo Tozzi

**Comments:** 4 Pages.

Here we make clear a striking correlation between Wittgenstein’s Tractatus Logico-Philosophicus, which assesses the logical relationships between language and world, and Perlovsky’s successful joint language-cognitive model, which assesses the relationships between language on one side and the knowledge instinct, correlated with basic and aesthetic emotions, on the other. This allows us to appraise the invaluable but dismissed content of the Tractatus in terms of the neurocomputational, mathematical techniques at hand. Therefore, the second Wittgenstein, who abandoned his previous philosophical framework, was wrong: human language and the cognitive world can still be assessed in terms of the logical armor described by the Tractatus.

**Category:** General Science and Philosophy

[305] **viXra:1705.0340 [pdf]**
*submitted on 2017-05-22 19:18:05*

**Authors:** Alban Grastien, Enrico Scala

**Comments:** 3 Pages.

The purpose of this document is to show the complexity of verifying the validity of a deterministic conformant plan. We concentrate on a simple version of the conformant planning problem (i.e., one where there is no precondition on the actions and where all conditions are defined as sets of positive or negative facts) in order to show that the complexity does not come from solving a single such formula.

**Category:** Artificial Intelligence

[304] **viXra:1705.0339 [pdf]**
*submitted on 2017-05-22 12:37:32*

**Authors:** George Rajna

**Comments:** 31 Pages.

An international team of physicists has monitored the scattering behavior of electrons in a non-conducting material in real-time. Their insights could be beneficial for radiotherapy. [23] Researchers from the University of Illinois at Urbana-Champaign have demonstrated a new level of optical isolation necessary to advance on-chip optical signal processing. The technique involving light-sound interaction can be implemented in nearly any photonic foundry process and can significantly impact optical computing and communication systems. [22] City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21] Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20] Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. 
[16] Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]

**Category:** Condensed Matter

[303] **viXra:1705.0338 [pdf]**
*submitted on 2017-05-22 13:04:04*

**Authors:** Han Geurdes

**Comments:** 14 Pages.

In this paper a tracer prognostic differential equation related to the HAMOC marine chemistry model is studied. Recently, the present author found that the Navier-Stokes equation has no exact solution. The following question is therefore justified: do numerical solutions of prognostic equations provide unique information about the distribution of nutrients in the ocean?

**Category:** Geophysics

[302] **viXra:1705.0337 [pdf]**
*submitted on 2017-05-22 07:22:25*

**Authors:** George Rajna

**Comments:** 21 Pages.

Researchers have shown how singularities – which are normally only found at the centre of black holes and hidden from view – could exist in highly curved three-dimensional space. [17] A team of scientists at the Tata Institute of Fundamental Research (TIFR), Mumbai, India, have found new ways to detect a bare or naked singularity, the most extreme object in the universe. [16] New data from NASA's Chandra X-ray Observatory and other telescopes has revealed details about this giant black hole, located some 145 million light years from Earth. [15] A team of researchers from around the world is getting ready to create what might be the first image of a black hole. [14] "There seems to be a mysterious link between the amount of dark matter a galaxy holds and the size of its central black hole, even though the two operate on vastly different scales," said Akos Bogdan of the Harvard-Smithsonian Center for Astrophysics (CfA). [13] If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12] For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in…well…a big crunch (for lack of a better term). Think "the Big Bang", except just the opposite. That's essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black.
[11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron–proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[301] **viXra:1705.0336 [pdf]**
*submitted on 2017-05-22 07:35:58*

**Authors:** Johan Noldus

**Comments:** 3 Pages. temporary notes about laws behind consciousness.

I give away two simple principles indicating the causes of spiritual disruptions of different degrees of severity.

**Category:** Quantum Physics

[300] **viXra:1705.0335 [pdf]**
*submitted on 2017-05-22 08:00:19*

**Authors:** George Rajna

**Comments:** 30 Pages.

The physicists in Göttingen are part of a German-Italian collaboration which has now published an amazing discovery in Nature Communications: even quantum systems can synchronize through self-organization, without any external control. This synchronization manifests itself in the strangest property of the quantum world – entanglement. [17] The quantum internet, which connects particles linked together by the principle of quantum entanglement, is like the early days of the classical internet – no one can yet imagine what uses it could have, according to Professor Ronald Hanson, from Delft University of Technology, the Netherlands, whose team was the first to prove that the phenomenon behind it was real. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. 
[13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

**Category:** Quantum Physics

[299] **viXra:1705.0334 [pdf]**
*submitted on 2017-05-22 08:47:38*

**Authors:** George Rajna

**Comments:** 21 Pages.

Astronomers have found tidal tails around a distant globular cluster known as NGC 7492. The newly discovered features could provide important information about the nature of globular clusters. [15] Late last year, an international team including researchers from the Kavli Institute for Astronomy and Astrophysics (KIAA) at Peking University announced the discovery of more than 60 extremely distant quasars, nearly doubling the number known to science-and thus providing dozens of new opportunities to look deep into our universe's history. [14] Fuzzy pulsars orbiting black holes could unmask quantum gravity. [13] Cosmologists trying to understand how to unite the two pillars of modern science – quantum physics and gravity – have found a new way to make robust predictions about the effect of quantum fluctuations on primordial density waves, ripples in the fabric of space and time. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. 
[10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has initiated a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence proposing that the speed of light, as defined by the theory of general relativity, is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. Even if gravitons are there, it's probable that we would never be able to perceive them. Perhaps, assuming they persist inside a robust model of quantum gravity, there may be secondary ways of proving their actuality. [7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this is true on the quantum level also, it gives the base of Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Astrophysics

[298] **viXra:1705.0333 [pdf]**
*submitted on 2017-05-22 09:05:28*

**Authors:** Alexander Bolonkin

**Comments:** 159 Pages.

New macro-projects, concepts, ideas, methods, and innovations are explored here, but hardly developed. There remain many problems that must be researched, modeled, and tested before these summarized research ideas can be practically designed, built, and utilized—that is, fully developed and utilized.
Most ideas in our book are described in the following way: 1) A description of the current state in a given field of endeavor, with a brief explanation of the idea researched, including its advantages and shortcomings. There are almost no equations; some ideas include only the results of estimations and computations, but every idea contains links to the initial scientific articles containing detailed proofs, equations, and computations. 2) A brief description of possible applications—candidate macro-projects—including estimations of the main physical parameters of such economic developmental undertakings.
The parts are in a popular form accessible to the wider reading public. The many original articles of this book will require some mathematical and scientific knowledge, such as may be found amongst technical school graduate students.
The book gives the main physical data which will help researchers, engineers, dedicated students and enthusiastic readers make estimations for their own macro-projects. Also, inventors will find an extensive field of inventions and innovations revealed in our book.

**Category:** Classical Physics

[297] **viXra:1705.0332 [pdf]**
*replaced on 2017-05-29 09:42:18*

**Authors:** Sylwester Kornowski

**Comments:** 8 Pages.

Here we show that the Scale-Symmetric Theory (SST) gives rise to the Standard Model (SM) of particle physics. We calculated the SM gauge couplings, obtaining g’ = 0.35706, g = 0.65235 (these two gauge couplings lead to an illusion of electroweak unification), and g(s) = 1.21529 +- 0.00360. We also describe the mechanism that leads to the mass of the muon; the muon mass calculated here is 105.6576 MeV. The other SM parameters were calculated in earlier papers. SST is based on only 7 parameters which, contrary to SM, lead also to the 3 masses of neutrinos (which are beyond SM) and to 4 basic physical constants (the reduced Planck constant, the gravitational constant (gravity is beyond SM), the speed of light in “vacuum”, and the electric charge of the electron). We can see that SST has 2.7 times fewer parameters, leads to the 19 initial parameters of SM, and describes phenomena beyond SM. This leads to the conclusion that SST is a more fundamental theory than SM.

**Category:** High Energy Particle Physics

[296] **viXra:1705.0331 [pdf]**
*submitted on 2017-05-22 05:05:19*

**Authors:** George Rajna

**Comments:** 17 Pages.

Quantum light emitters, or quantum dots, are of interest for many different applications, including quantum communication and networks. [12] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. 
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[295] **viXra:1705.0330 [pdf]**
*submitted on 2017-05-22 05:15:16*

**Authors:** Sai Venkatesh Balasubramanian

**Comments:** 11 Pages.

This article explores the nonlinear aspects underlying music, particularly focusing on melody. Using the concept of scale as the basis, the article explores ways to formulate and study the features and 'feature richness' of a given melody or Raga; to do this, the Raga scale is represented as a 1-dimensional array. The Signature graph of a Raga, plotted as Interval as a function of Note position, establishes a graphic visualization of the Raga. The progression and trend of intervals is computed using the Second Level Interval Array. This trend graph reveals the complexity in a Raga structure through looping, crowded and intricate curves. Next, the concept of chaos in the context of melody is explored, fundamentally by performing a sensitivity test, which analyzes how, given a Raga and a particular evolution path, starting at two nearby Swaras results in two entirely different ending Swaras when sampled after a certain period of time. As a measure of the complexity in a Raga, the entropy, a measure of uncertainty, is proposed and computed using the interval arrays as bases for an occurrence array yielding empirical probabilities. The entropy is seen as a measure of richness, of the variety of inter-Swara intervals that a given Raga possesses. One notes that Ragas with high entropy, on account of their interval richness, usually fall under the category of pleasant, appealing and melodious Ragas. These are also the Ragas one finds employed in film music, clearly owing to their pleasant feel.
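The interval-array and entropy computations described above can be sketched as follows; the scales, their names and semitone positions are illustrative assumptions, not data taken from the paper:

```python
from collections import Counter
import math

def interval_array(notes):
    """First-level intervals: successive differences of note positions."""
    return [b - a for a, b in zip(notes, notes[1:])]

def entropy(intervals):
    """Shannon entropy (bits) of the empirical interval distribution."""
    counts = Counter(intervals)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical Raga scales as 1-D arrays of semitone positions (illustrative only)
scales = {
    "pentatonic-like": [0, 2, 4, 7, 9, 12],
    "heptatonic-like": [0, 1, 4, 5, 7, 8, 11, 12],
}

for name, scale in scales.items():
    first = interval_array(scale)    # signature: interval vs. note position
    second = interval_array(first)   # second-level interval array (trend)
    print(name, first, second, round(entropy(first), 3))
```

A scale whose intervals are all equal has zero entropy; the more varied the inter-Swara intervals, the higher the entropy, matching the paper's use of entropy as "interval richness".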

**Category:** Mathematical Physics

[294] **viXra:1705.0329 [pdf]**
*submitted on 2017-05-22 06:51:57*

**Authors:** George Rajna

**Comments:** 24 Pages.

Achieving magnetic order in low-dimensional systems consisting of only one or two dimensions has been a research goal for some time. [34] The electron microscope, a powerful tool for science, just became even more powerful, with an improvement developed by Cornell physicists. Their electron microscope pixel array detector (EMPAD) yields not just an image, but a wealth of information about the electrons that create the image and, from that, more about the structure of the sample. [33] An innovative new technique to produce the quickest, smallest, highest-capacity memories for flexible and transparent applications could pave the way for a future golden age of electronics. [32] The shrinking of electronic components and the excessive heat generated by their increasing power has heightened the need for chip-cooling solutions, according to a Rutgers-led study published recently in Proceedings of the National Academy of Sciences. Using graphene combined with a boron nitride crystal substrate, the researchers demonstrated a more powerful and efficient cooling mechanism. [31] Materials like graphene can exhibit a particular type of large-amplitude, stable vibrational modes that are localised, referred to as Discrete Breathers (DBs). [30] A two-dimensional material developed by Bayreuth physicist Prof. Dr. Axel Enders together with international partners could revolutionize electronics. [29] Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor-meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive-which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. 
[27] Superconductivity is a rare physical state in which matter is able to conduct electricity—maintain a flow of electrons—without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov hones in on the structural changes underlying superconductivity in iron arsenide compounds—those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Condensed Matter

[293] **viXra:1705.0328 [pdf]**
*submitted on 2017-05-21 13:47:48*

**Authors:** George Rajna

**Comments:** 39 Pages.

New research proposes a way to test whether quantum entanglement is affected by consciousness. [26] Using atomic-scale quantum defects in diamonds known as nitrogen-vacancy (NV) centers to detect the magnetic field generated by neural signals, scientists working in the lab of Ronald Walsworth, a faculty member in Harvard's Center for Brain Science and Physics Department, demonstrated a noninvasive technique that can image the activity of neurons. [25] Neuroscience and artificial intelligence experts from Rice University and Baylor College of Medicine have taken inspiration from the human brain in creating a new "deep learning" method that enables computers to learn about the visual world largely on their own, much as human babies do. [24]

**Category:** Mind Science

[292] **viXra:1705.0326 [pdf]**
*submitted on 2017-05-21 15:19:36*

**Authors:** Luca Nascimbene

**Comments:** 10 Pages.

This particle, which takes my last name, was discovered in the lab using electronic instrumentation built from various electronic components; applying a physical formula I invented, it corresponds to a new form of particle.
The electronic instrumentation I used: an oscilloscope, an operational amplifier (LM741), a 100 nF capacitor, two 1 kΩ resistors, and a photomultiplier sensor (particle detector) together with a photodiode.

**Category:** Classical Physics

[291] **viXra:1705.0325 [pdf]**
*submitted on 2017-05-21 15:22:44*

**Authors:** Bill Gaede

**Comments:** 14 Pages.

The wave model of light was born in the 17th Century and was quickly abandoned in favor of the old Corpuscular Hypothesis on the strength of Newton’s authority. It flourished again in the 19th Century only to be eclipsed once again by the Corpuscular Hypothesis at the turn of the century. The participants at the 5th Solvay Conference reached a compromise in 1927 and finally merged the wave and the corpuscle into an unfathomable concoction known as the ‘wave-packet’. This is the official model today, but now it rests on the authority of Niels Bohr. However, the Wave-Packet Hypothesis is not about architecture. The mathematical establishment has turned the argument upside down and incongruously states that light ‘behaves’ as a wave or as a particle depending on the circumstances. There is, therefore, no formal physical configuration of light in Mathematical Physics that a theorist can challenge. Many in the establishment even argue that a mediator is unnecessary and dispense with one entirely in their talks. We compare the wave, particle, field, and wave-packet models championed by Classical Mechanics, Quantum Mechanics, and General Relativity against the Rope Hypothesis to underscore that a new paradigm has emerged in the centuries-old debate.

**Category:** Nuclear and Atomic Physics

[290] **viXra:1705.0324 [pdf]**
*submitted on 2017-05-21 16:06:50*

**Authors:** Eric Su

**Comments:** 2 Pages. microwave speed relativity reference frame

A standing wave consists of two identical waves moving in opposite directions. A frequency detector moving toward the standing wave will detect two different frequencies: one blueshifted, the other redshifted. The distance between two adjacent nodes in the standing wave is equal to half of the wavelength of both waves. Consequently, the detector will infer different speeds for the two waves, owing to the same wavelength paired with different frequencies. The calculation of speed is demonstrated with a typical household microwave oven, which emits microwaves at a frequency around 2.45 GHz and a wavelength around 12.2 cm.
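The oven figures quoted above can be turned into a quick consistency check; this is an illustrative sketch using the abstract's round numbers (the variable names are mine, not the author's):

```python
# Consistency check: wave speed v = f * wavelength, where the wavelength
# is twice the spacing between adjacent standing-wave nodes.
frequency_hz = 2.45e9       # typical household magnetron, ~2.45 GHz
node_spacing_m = 0.061      # adjacent nodes sit half a wavelength apart
wavelength_m = 2 * node_spacing_m     # ~12.2 cm, as quoted in the abstract
speed_m_per_s = frequency_hz * wavelength_m   # ~2.99e8 m/s, close to c
```

With these round figures the product lands within about 0.3% of the speed of light.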

**Category:** Relativity and Cosmology

[289] **viXra:1705.0323 [pdf]**
*submitted on 2017-05-21 16:24:20*

**Authors:** Mark Krinker, Galina Pana

**Comments:** 10 Pages.

The paper deals with a synergistic effect caused by contact between two persons. The newly originated system manifests itself in spikes of energy and oscillating processes. The effective energy of the process has been calculated for various combinations of human pairs.

**Category:** Physics of Biology

[288] **viXra:1705.0322 [pdf]**
*submitted on 2017-05-21 18:45:35*

**Authors:** Koji Nagata, Tadao Nakamura, Han Geurdes, Ahmed Farouk, Josep Batle, Soliman Abdalla, Germano Resconi

**Comments:** 4 Pages

We present a new quantum algorithm that determines a property of a function, namely whether $f(x)=f(-x)$. How fast can we succeed? The quantum algorithm does not use the Hadamard transformation; all we need is to evaluate $|\overbrace{0,0,...,1}^N\rangle$. In this way we can learn the global property, that is, whether $f(x)=f(-x)$ holds for the given numbers. Our quantum algorithm outperforms its classical counterpart by a factor of $O(2^N)$.

**Category:** Quantum Physics

[287] **viXra:1705.0321 [pdf]**
*submitted on 2017-05-21 19:45:25*

**Authors:** Andrew Beckwith

**Comments:** 6 Pages.

First we review Klauber's quantum field theory calculation of the vacuum energy density. Instead of the Planck mass (about 10^19 GeV), which leads to an answer 10^122 times too large, we use as a cutoff a number N of gravitons times the graviton mass (assumed to be about 10^-43 GeV). This yields a count N of about 10^31 if the vacuum energy is to avoid the 10^122 overshoot and instead come out near 10^-47 GeV^4. Afterwards, we use the results of Mueller and Lousto to compare this N of 10^31, interpreted as entropy via Ng's infinite quantum statistics, with the ratio of the square of the Hubble (observational) radius to a calculated grid size, which we call a. Here a is roughly a minimum time step, delta t, times the speed of light. We then use a root-finding procedure to obtain the initial conditions, where we use an inflaton value derived from a scale factor, taken as the variation of the time component of the metric tensor from Pre-Planckian space-time up to the Planckian space-time initial values.
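The graviton-count estimate can be checked at order of magnitude. The sketch below assumes, as a crude stand-in for Klauber's full calculation, that the vacuum energy density scales as the fourth power of the cutoff; the variable names and that simplification are mine:

```python
# Crude order-of-magnitude check: with cutoff = N * m_graviton and
# vacuum energy density ~ cutoff**4, solve for the graviton count N.
m_graviton_gev = 1e-43      # assumed graviton mass, ~1e-43 GeV
rho_vacuum_gev4 = 1e-47     # target vacuum energy density, ~1e-47 GeV^4
cutoff_gev = rho_vacuum_gev4 ** 0.25        # ~1.8e-12 GeV
n_gravitons = cutoff_gev / m_graviton_gev   # ~1.8e31, near the quoted ~1e31
```

Under this simplification the count comes out a factor of ~2 above 10^31, consistent with the abstract's order of magnitude.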

**Category:** Quantum Gravity and String Theory

[286] **viXra:1705.0320 [pdf]**
*submitted on 2017-05-21 21:54:46*

**Authors:** Yanming Wei

**Comments:** 6 pages, 0 figure. DOI: 10.13140/RG.2.2.29478.73286

The Russian scientists D.V. Filippov and L.I. Urutskoev pioneered experimental research and theoretical exploration of such reactions, which they named transformation, or C-LENR (Collective Low Energy Nuclear Reaction). In this paper, I present some comments on the intrinsic mechanism and, finally, propose my conjecture as an alternative explanation of the overunity phenomenon in Graneau's water explosion experiment.

**Category:** Nuclear and Atomic Physics

[285] **viXra:1705.0319 [pdf]**
*submitted on 2017-05-21 23:04:19*

**Authors:** Guillermo A Rios

**Comments:** 33 Pages.

This philosophical essay contrasts the difference in industrial/social development between the Latin-American nations and those of the developed First World, and attributes the origin of such differences to the lack of relevance that the teaching of Adam Smith has been given in those nations by their academia and political classes. It starts with a summary of Smith's economic ideas as expressed in his book "The Wealth of Nations." Some of those ideas are then extrapolated, such as analyzing the survival value of the "trading gene" present in our species and no other (as Adam Smith keenly observed), and introducing the conception of modeling worldwide trading as an artificial neural network, leading to the concept of human "supraintelligence" as the product of such a network. The philosophical battle against those who throughout history have sought to equate poverty with virtue is also examined, as well as the philosophical duel at the time of the Economics Enlightenment between Jean-Jacques Rousseau and Voltaire, another intellectual giant rather ignored by Latin-American academia. At the end we retake Adam Smith's ideas as the conclusion of that philosophical duel.

**Category:** General Science and Philosophy

[284] **viXra:1705.0318 [pdf]**
*submitted on 2017-05-22 00:14:36*

**Authors:** Temur Z. Kalanov

**Comments:** 26 Pages.

The correct scientific and critical analysis of the generally accepted foundations of classical mechanics is proposed. The methodological basis for the analysis is the unity of formal logic and of rational dialectics. The main results of the analysis are as follows: (1) the correct starting point of kinematics is formulated: the informational definition of the concept of time; definitions of the concepts of motion, speed, and acceleration of material point in the metric system of coordinates; the principle of motion of quantum particle (photon); proof of the mathematical, physical, and formal-logical erroneousness (fallaciousness) of Lorentz transformations; (2) the correct starting point of dynamics is formulated: the definition of force as a physical property of the structure of the system of the interacting objects; (3) the correct starting point of the theory of gravitation is formulated: the condition of existence of the gravitational interaction which represents the condition of existence of the region of overlap (superposition, intersection) of the gravitational fields of the material objects; (4) the correct formulation of the law of gravitation within the framework of the system approach is given (the formulation represents the system of the proportions); (5) it is proved that the formulation of Newton’s empirical law of gravitation represents the formal-logical and dialectical errors.

**Category:** Classical Physics

[283] **viXra:1705.0317 [pdf]**
*submitted on 2017-05-22 02:14:41*

**Authors:** Antonio Puccini

**Comments:** 3 Pages.

As is known, the Weak Nuclear Force (WNF) acts between quarks (Qs) and leptons. The action of the WNF is mediated by highly massive gauge bosons. How does a Q emit such a massive particle, approximately 16,000 or 40,000 times its mass? Who provides so much energy to an up Q or a down Q? It must be considered that, according to Quantum Mechanics, it is possible to borrow some energy temporarily, but under a precise and binding condition established by the Uncertainty Principle: the higher the energy borrowed, the shorter the duration of the loan. Our calculations show that the maximum distance these bosons can travel, i.e. the upper limit of their range, corresponds to 1.543×10^-15 cm for the W+ and W- particles and 1.36×10^-15 cm for the Z° particle.
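For comparison, a textbook uncertainty-principle sketch of such a range is r ~ ħc/(mc²). The constants and boson masses below are standard values; this rough estimate is not the author's own calculation and need not reproduce his quoted figures:

```python
# Rough range estimate r ~ hbar*c / (m*c^2) for a massive mediator.
hbar_c_gev_cm = 1.9733e-14   # hbar*c ≈ 197.33 MeV·fm, expressed in GeV·cm
m_w_gev = 80.4               # W boson mass, GeV
m_z_gev = 91.2               # Z boson mass, GeV
range_w_cm = hbar_c_gev_cm / m_w_gev   # ~2.5e-16 cm
range_z_cm = hbar_c_gev_cm / m_z_gev   # ~2.2e-16 cm
```

Either way the range comes out far below nuclear size, which is why the weak interaction appears point-like at low energies.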

**Category:** Nuclear and Atomic Physics

[282] **viXra:1705.0316 [pdf]**
*submitted on 2017-05-22 02:58:33*

**Authors:** Nikitina N.N., Nikitin V.N., Nikitin I.V.

**Comments:** 1 Page.

As galaxies move toward the outskirts of the Universe, gravity will increase and will, in the end, win, crushing everything and turning it into "nothing."

**Category:** Astrophysics

[281] **viXra:1705.0314 [pdf]**
*submitted on 2017-05-21 07:59:43*

**Authors:** George Rajna

**Comments:** 14 Pages.

Statistical analysis of mini-spiral galaxies shows an unexpected interaction between dark matter and ordinary matter. [14] "The best result on dark matter so far—and we just got started." This is how scientists behind XENON1T, now the most sensitive dark matter experiment worldwide, commented on their first result from a short 30-day run presented today to the scientific community. [13] The gravitational force attracts the matter, causing concentration of matter in a small space and leaving much space with low matter concentration: dark matter and energy. There is an asymmetry between the masses of the electric charges, for example the proton and the electron, which can be understood by the asymmetrical Planck Distribution Law. This temperature dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high probability event. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter. SIMPs would resolve certain discrepancies between simulations of the distribution of dark matter and the observed properties of the galaxies. In particle physics and astrophysics, weakly interacting massive particles, or WIMPs, are among the leading hypothetical particle physics candidates for dark matter.

**Category:** Astrophysics

[280] **viXra:1705.0313 [pdf]**
*submitted on 2017-05-21 09:43:28*

**Authors:** George Rajna

**Comments:** 32 Pages.

It's man vs machine this week as Google's artificial intelligence programme AlphaGo faces the world's top-ranked Go player in a contest expected to end in another victory for rapid advances in AI. [19]
Google's computer programs are gaining a better understanding of the world, and now it wants them to handle more of the decision-making for the billions of people who use its services. [18]
Microsoft on Wednesday unveiled new tools intended to democratize artificial intelligence by enabling machine smarts to be built into software from smartphone games to factory floors. [17]
The closer we can get a machine translation to be on par with expert human translation, the happier lots of people struggling with translations will be. [16]
Researchers have created a large, open source database to support the development of robot activities based on natural language input. [15]
A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14]
A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence and have come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13]
Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12]
A team of researchers working at the University of California (and one from Stony Brook University) has for the first time created a neural-network chip that was built using just memristors. In their paper published in the journal Nature, the team describes how they built their chip and what capabilities it has. [11]
A team of researchers used a promising new material to build more functional memristors, bringing us closer to brain-like computing. Both academic and industrial laboratories are working to develop computers that operate more like the human brain. Instead of operating like a conventional, digital system, these new devices could potentially function more like a network of neurons. [10]
Cambridge Quantum Computing Limited (CQCL) has built a new Fastest Operating System aimed at running the futuristic superfast quantum computers. [9]
IBM scientists today unveiled two critical advances towards the realization of a practical quantum computer. For the first time, they showed the ability to detect and measure both kinds of quantum errors simultaneously, as well as demonstrated a new, square quantum bit circuit design that is the only physical architecture that could successfully scale to larger dimensions. [8]
Physicists at the Universities of Bonn and Cambridge have succeeded in linking two completely different quantum systems to one another. In doing so, they have taken an important step forward on the way to a quantum computer. To accomplish their feat the researchers used a method that seems to function as well in the quantum world as it does for us people: teamwork. The results have now been published in the "Physical Review Letters". [7]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer.

**Category:** Artificial Intelligence

[279] **viXra:1705.0312 [pdf]**
*submitted on 2017-05-21 10:34:05*

**Authors:** George Rajna

**Comments:** 23 Pages.

Of the many 'white whales' that theoretical physicists are pursuing, the elusive magnetic monopole, a magnet with only one pole, is one of the most confounding. [14] The transformation of a quantum monopole into a Dirac monopole has been observed for the first time by physicists at Amherst College in the US and Aalto University in Finland. [13] Scientists at Amherst College (USA) and Aalto University (Finland) have made the first experimental observations of the dynamics of isolated monopoles in quantum matter. [12] Building on his own previous research, Amherst College professor David S. Hall '91 and a team of international collaborators have experimentally identified a pointlike monopole in a quantum field for the first time. The discovery, announced this week, gives scientists further insight into the elusive monopole magnet, an elementary particle that researchers believe exists but have not yet seen in nature. [11] For the first time, physicists have achieved interference between two separate atoms: when sent towards the opposite sides of a semi-transparent mirror, the two atoms always emerge together. This type of experiment, which was carried out with photons around thirty years ago, had so far been impossible to perform with matter, due to the extreme difficulty of creating and manipulating pairs of indistinguishable atoms. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level, it is the dark energy and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[278] **viXra:1705.0311 [pdf]**
*submitted on 2017-05-21 06:18:46*

**Authors:** Wan-Chung Hu

**Comments:** 5 Pages.

Here, I use the Higgs mechanism to unify gluons and the photon and thereby explain the origin of the gluons' mass in the strong interaction. This is the electro-strong unification, which can explain the masses of the neutron and proton.

**Category:** High Energy Particle Physics

[277] **viXra:1705.0310 [pdf]**
*submitted on 2017-05-20 13:20:43*

**Authors:** George Rajna

**Comments:** 34 Pages.

While studying the underpinnings of multiple sclerosis, investigators at Brigham and Women's Hospital came across important clues for how to treat a very different disease: cancer. [21] A major challenge in truly targeted cancer therapy is cancer's suppression of the immune system. Northwestern University synthetic biologists now have developed a general method for "rewiring" immune cells to flip this action around. [20] Scientists at the University of Bonn have succeeded in observing an important cell protein at work using a method that measures structural changes within complex molecules. [19] Scientists have now explored a modified form that can produce light-generated electrons and store them for catalytic hydrogen production even after the light has been switched off. They present this biomimetic photosynthesis approach in the journal Angewandte Chemie. [18] Scientists at The Australian National University (ANU) have designed a nano crystal around 500 times smaller than a human hair that turns darkness into visible light and can be used to create lightweight night-vision glasses. [17] Magnets instead of antibiotics could provide a possible new treatment method for blood infection. [16] One of the biggest challenges in cognitive or rehabilitation neurosciences is the ability to design a functional hybrid system that can connect and exchange information between biological systems, like neurons in the brain, and human-made electronic devices. [15] Wearable terahertz scanning device for inspection of medical equipment and the human body. [14] Optical microscopy experts at Colorado State University are once again pushing the envelope of biological imaging. [13] Researchers at the University of Melbourne have developed a way to radically miniaturise a Magnetic Resonance Imaging (MRI) machine using atomic-scale quantum computer technology. 
[12] With one in two Australian children reported to have tooth decay in their permanent teeth by age 12, researchers from the University of Sydney believe they have identified some nanoscale elements that govern the behaviour of our teeth. [11]

**Category:** Physics of Biology

[276] **viXra:1705.0309 [pdf]**
*replaced on 2017-05-23 15:24:32*

**Authors:** Roman Vinokur

**Comments:** 4 Pages. Replaced.

The human factor is very important when high-level R&D skills are needed (“Intellectuals solve problems, geniuses prevent them.” - Albert Einstein). Many people believe more in the power of teamwork than in “myths” of the lone genius. However, if a project goal can be successfully achieved by just one high-skilled professional, it is not wise to delegate this duty to a group of low-skilled persons (“A great engineer is worth 100 average engineers.” - Facebook CEO Mark Zuckerberg). Generally, the dilemma “ordinary team or single expert” is of frequent interest to R&D organizations.
In this paper, an approximate mathematical approach is developed to estimate the critical (minimum) size of an ordinary R&D team that can be more successful than a single expert. As a result, this method could help to resolve the “hire or not hire” issue.
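The abstract does not state the underlying model. As one illustrative sketch (entirely my assumption, not Vinokur's method): if each ordinary member independently succeeds with probability p and the single expert succeeds with probability P, the critical team size is the smallest n with 1 - (1-p)^n >= P:

```python
import math

def critical_team_size(p_member: float, p_expert: float) -> int:
    """Smallest n such that a team of n independent members, each
    succeeding with probability p_member, matches or beats an expert
    who succeeds with probability p_expert: 1 - (1-p)**n >= P."""
    if not (0.0 < p_member < 1.0 and 0.0 < p_expert < 1.0):
        raise ValueError("probabilities must lie strictly between 0 and 1")
    return math.ceil(math.log(1.0 - p_expert) / math.log(1.0 - p_member))

# e.g. members succeeding 10% of the time vs. an expert at 90%:
n = critical_team_size(0.10, 0.90)   # 22 members needed under this model
```

The independence assumption is generous to the team; any real coordination overhead would push the critical size higher.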

**Category:** General Mathematics

[275] **viXra:1705.0308 [pdf]**
*replaced on 2017-08-21 17:22:03*

**Authors:** Peter V. Raktoe

**Comments:** 4 Pages.

There is a reason why general relativity cannot be unified with quantum mechanics: physicists don't realize that Einstein's explanation of gravity is not real. Einstein's gravity is a mathematical gravity, and you cannot unify something that is based on mathematical fiction (general relativity) with reality (quantum mechanics). I will show you how I unified general relativity with quantum mechanics; I was able to do it because I found the origin of gravity and time.

**Category:** Quantum Physics

[274] **viXra:1705.0307 [pdf]**
*submitted on 2017-05-21 00:53:45*

**Authors:** Fang Zhou

**Comments:** 10 pages in Chinese

The article presents several physical models as thought experiments to demonstrate the absoluteness of simultaneity and non-simultaneity of events in reference frames. The Time Transformation of the Zhoufang Transformation (Z-Transformation) actually underlies the absoluteness of simultaneity and non-simultaneity of events in reference frames in constant-speed translational motion.

**Category:** Relativity and Cosmology

[273] **viXra:1705.0306 [pdf]**
*submitted on 2017-05-21 03:06:51*

**Authors:** Yin Zhu

**Comments:** 17 Pages.

1. All of the conclusions are drawn directly from, or based on, the arguments and discussions on researchgate.net. Sometimes the arguments are very intense and sharp. So far, these conclusions seem to stand solidly.
2. The problems discussed here are easily understood; only simple mathematics is used. From the literature, it is clearly shown that Einstein had no original work in the theory of relativity. The story that Einstein independently arrived at all the conclusions of the theory of relativity is simply fabricated.
3. The conclusions are radical. It is plainly declared that Einstein's theory of relativity is only pseudoscience, and that this pseudoscience was produced from anti-ethics: Einstein had no original work in the theory of relativity, yet all of the results of the theory are now ascribed to Einstein.
4. General readers may not believe the conclusions. But once you know that the theory of relativity is filled with fabricated stories, you should know what that means.
5. The arguments on researchgate.net showed that, at least, no relativist can disprove the conclusions. Some of the rational relativists have had to agree with some of them, but most relativists can only remain silent.
6. Maybe relativists will close their eyes to the conclusions and continue to declare the theory of relativity a great theory, as they did in the past. But now it is the internet age. The conclusions cannot be concealed and shielded as they were before; they can be spread widely and quickly. Relativists will lose their public credibility quickly if they cannot produce a valid response.
7. We clearly know it is risky to criticize Einstein. But scientists, including relativists, want to know the scientific truth. We believe we have clear and simple arguments; therefore we hope some mainstream physicists will accept the conclusions.

**Category:** Relativity and Cosmology

[272] **viXra:1705.0305 [pdf]**
*submitted on 2017-05-20 11:37:01*

**Authors:** George Rajna

**Comments:** 14 Pages.

Each galaxy has a fado—a narrative of its biography since the birth of its first stars. This fate is written in its electromagnetic spectrum, which contains the fossil records of multiple stellar populations that formed over several billion years, as well as the gas that those stars ionize with their radiation. [8] The Earth is constantly jostled by low-frequency gravitational waves from supermassive black hole binaries in distant galaxies. Astrophysicists are using pulsars as a galaxy-sized detector to measure the Earth's motion from these waves. [7] Last week's announcement that Gravitational Waves (GW) have been detected for the first time—as a result of the merger of two black holes—is huge news. But now a Gamma Ray Burst (GRB) originating from the same place, and that arrived at Earth 0.4 seconds after the GW, is making news. Isolated black holes aren't supposed to create GRBs; they need to be near a large amount of matter to do that. [6] In a landmark discovery for physics and astronomy, international scientists said Thursday they have glimpsed the first direct evidence of gravitational waves, or ripples in space-time, which Albert Einstein predicted a century ago. [5] Scientists at the National Institute for Space Research in Brazil say an undiscovered type of matter could be found in neutron stars (illustration shown). Here matter is so dense that it could be 'squashed' into strange matter. This would create an entire 'strange star', unlike anything we have seen. [4] The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the electromagnetic inertia, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Astrophysics

[271] **viXra:1705.0304 [pdf]**
*submitted on 2017-05-20 07:42:57*

**Authors:** LeiGuanji

**Comments:** 6 Pages.

Abstract: From Einstein we learned the relation between gravity and time, and from thermodynamics we know that time and entropy are related. So gravity must be related to entropy; in fact, some experiments have supported this. I write this paper to describe this relation between gravity and entropy.

**Category:** Quantum Gravity and String Theory

[270] **viXra:1705.0302 [pdf]**
*replaced on 2018-03-08 13:01:02*

**Authors:** John A. Gowan, August T. Jaccaci

**Comments:** 1 Page. emphasizing energy conservation

A semiotic model in the geometric form of a tetrahedron is used to represent relationships between essential physical principles in a minimal "Theory of Everything".

**Category:** Quantum Gravity and String Theory

[269] **viXra:1705.0301 [pdf]**
*submitted on 2017-05-20 09:03:43*

**Authors:** George Rajna

**Comments:** 12 Pages.

Last week, the detectors of the Large Hadron Collider (LHC) witnessed their first collisions of 2017. [8] As physicists were testing the repairs of LHC by zipping a few spare protons around the 17 mile loop, the CMS detector picked up something unusual. The team feverishly pored over the data, and ultimately came to an unlikely conclusion—in their tests, they had accidentally created a rainbow universe. [7] The universe may have existed forever, according to a new model that applies quantum correction terms to complement Einstein's theory of general relativity. The model may also account for dark matter and dark energy, resolving multiple problems at once. [6] This paper explains the Accelerating Universe, the Special and General Relativity from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the moving electric charges. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Relativistic Quantum Theories. The acceleration caused by the Big Bang created the radial currents of matter, and since matter is composed of negative and positive charges, these currents create magnetic fields and attracting forces between the parallel moving electric currents. This is the gravitational force experienced by the matter, and the mass is also a result of the electromagnetic forces between the charged particles. The positively and negatively charged currents attract each other, either by the magnetic forces or by the much stronger electrostatic forces. The gravitational force attracts the matter, causing concentration of matter in a small space and leaving much space with low matter concentration: dark matter and energy.

**Category:** High Energy Particle Physics

[268] **viXra:1705.0300 [pdf]**
*replaced on 2017-05-23 08:56:56*

**Authors:** Miroslav Josipović

**Comments:** 80 Pages. geometric algebra

This is the translation of the article "Multiplication of Vectors and Structure of 3D Euclidean Space" to Croatian.

**Category:** Mathematical Physics

[267] **viXra:1705.0299 [pdf]**
*submitted on 2017-05-20 10:04:50*

**Authors:** Norberto Meyer, Roberto Morales

**Comments:** 2 Pages.

Newton's law is rewritten from a more mathematical standpoint.

**Category:** Classical Physics

[266] **viXra:1705.0298 [pdf]**
*submitted on 2017-05-20 10:18:25*

**Authors:** George Rajna

**Comments:** 42 Pages.

Rutgers researchers have developed a new way to analyze hundreds of thousands of cells at once, which could lead to faster and more accurate diagnoses of illnesses, including tuberculosis and cancers. [24] An international team including researchers from MIPT has shown that iodide phasing—a long-established technique in structural biology—is universally applicable to membrane protein structure determination. [23] Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22] A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21] Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20] Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible – they have developed a new fluorescence microscope, called MINFLUX, allowing, for the first time, to optically separate molecules, which are only nanometers (one millionth of a millimeter) apart from each other. [19] Dipole orientation provides new dimension in super-resolution microscopy [18] Fluorescence is an incredibly useful tool for experimental biology and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17] Molecules that change colour can be used to follow in real-time how bacteria form a protective biofilm around themselves. 
This new method, which has been developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and the food industry, where bacterial biofilms are a problem. [16]

**Category:** Physics of Biology

[265] **viXra:1705.0297 [pdf]**
*submitted on 2017-05-20 04:27:04*

**Authors:** Terubumi Honjou

**Comments:** 3 Pages.

If the mystery of the Pioneer anomaly is "gravitational action by the photon group of sunlight", then the mass of the pulsating photon process is a perfect candidate for dark matter.
The photon of the particle pulsation hypothesis is a perfect candidate for dark matter.
* The ultra-high-speed, pulsating photons of the particle pulsation hypothesis move through the extra dimension at very high speed, with great kinetic energy. Although such a photon is a massless wave of light propagating at the speed of light, it is a particle with mass in the extra dimension. This is a particle in a Kaluza-Klein state: the mass is hidden in the extra dimension. Space is filled with these photons, in sufficient quantity to constitute 23% of the universe, and they are everywhere in the vacuum around us. They can be said to be the perfect candidate for dark matter.

**Category:** Astrophysics

[264] **viXra:1705.0296 [pdf]**
*submitted on 2017-05-20 04:45:05*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 3 Pages.

**Category:** Statistics

[263] **viXra:1705.0295 [pdf]**
*submitted on 2017-05-19 14:06:52*

**Authors:** Stephen P. Smith, Cambrian Lopez, Nicole Lam

**Comments:** 15 Pages.

Various DNA testing companies promise their customers a collection of genetic matches to facilitate finding family members. The matches are in centimorgans (cM), where the higher the cM value the closer the relationship to a customer (R). Unless the relationship is close, such as parent-offspring or among 1st cousins, a single cM value is not that informative if the goal is to locate family. This paper describes a statistical method that combines a collection of cM values from a cluster of unknown relatives of R, where the cluster members are known among themselves as being, for example, 2nd and 3rd cousins. A presumed envoy is attached to the cluster, where R is a descendant of the envoy, and the various cM values are combined to provide an overall cM value between R and the envoy. The envoy's cM comes with a statistical error to judge significance. Unlike a single cM value on a typical unknown relative, the envoy's cM can be quite large and indicative of a real genetic path to R that had previously been undiscovered. This paper describes the method for two sisters, where the path from the envoy led to their lost father, who was later found.

**Category:** Quantitative Biology

[262] **viXra:1705.0293 [pdf]**
*submitted on 2017-05-19 11:18:33*

**Authors:** Prashanth Rao

**Comments:** 1 Page.

If p is any odd prime number and c is any odd number less than p, then there must exist a positive number c' less than p such that cc' ≡ -2 (mod p).
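The claim is easy to check computationally: since p is prime, every odd c < p is invertible modulo p, so c' = (-2 * c^-1) mod p always exists and is nonzero. A minimal verification sketch (illustrative only, not taken from the paper):

```python
# Claim: for every odd prime p and every odd c < p, there exists a
# positive c' < p with c * c' ≡ -2 (mod p).

def is_prime(n: int) -> bool:
    return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

def find_cprime(p: int, c: int) -> int:
    # Direct construction: c' = -2 * c^(-1) mod p.
    # pow(c, -1, p) (Python 3.8+) computes the modular inverse of c.
    return (-2 * pow(c, -1, p)) % p

for p in (n for n in range(3, 120) if is_prime(n)):  # odd primes below 120
    for c in range(1, p, 2):                         # odd c < p
        cp = find_cprime(p, c)
        assert 0 < cp < p and (c * cp) % p == (-2) % p

print("claim verified for all odd primes below 120")
```

Since p is odd, -2 * c^(-1) is never divisible by p, so the constructed c' is always strictly between 0 and p.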

**Category:** Number Theory

[261] **viXra:1705.0292 [pdf]**
*submitted on 2017-05-19 07:22:38*

**Authors:** George Rajna

**Comments:** 27 Pages.

IBM scientists have achieved an important milestone toward creating sophisticated quantum devices that could become a key component of quantum computers. [17] While technologies that currently run on classical computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn't exist and the possibilities that you need to explore to get to the answer are too enormous to ever be processed by classical computers. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. 
Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. 
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, making it a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Condensed Matter

[260] **viXra:1705.0291 [pdf]**
*submitted on 2017-05-19 07:40:16*

**Authors:** Alfredo Dimas Moreira Garcia

**Comments:** 122 Pages.

The Special Theory of Relativity leads to two results that many renowned scientists still consider "inexplicable", namely:
-the dilation of time, and
-the Lorentz length contraction.
The search for a solution to these has driven the author to develop the Undulating Relativity (UR) theory, in which the temporal variation is due to differences in the paths of light propagation, and lengths are constant between two frames in uniform relative motion.

**Category:** Relativity and Cosmology

[259] **viXra:1705.0290 [pdf]**
*submitted on 2017-05-19 07:49:07*

**Authors:** Alfredo Dimas Moreira Garcia

**Comments:** 122 Pages.

The Special Theory of Relativity leads to two results, considered incomprehensible by several renowned physicists: time dilation and the so-called Lorentz spatial contraction. The solution of these paradoxes led me to develop Undulating Relativity, in which the temporal variation is due to the difference in the light-propagation paths and space is constant between the observers.
From the analysis of the development of Undulating Relativity we can summarize the following conclusions:
-it is a theory with entirely physical principles,
-the transformations are linear,
-it keeps the Euclidean principles intact,
-it considers the Galilean transformation to be distinct in each frame of reference,
-it unites the speed of light and time in a single phenomenon,
-it develops a real translation between the frames of reference

**Category:** Relativity and Cosmology

[258] **viXra:1705.0289 [pdf]**
*submitted on 2017-05-19 07:56:37*

**Authors:** Helmut Preininger

**Comments:** 14 Pages.

In this paper we take a closer look at the distribution of the residues of squarefree natural numbers and explain an algorithm to compute those distributions.
We also give some conjectures about the minimal number of cycles in the squarefree arithmetic progression and explain an algorithm to compute these minimal numbers.
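For small moduli the residue distribution in question can be tabulated directly; the brute-force sketch below is illustrative and is not the author's algorithm:

```python
from collections import Counter

def is_squarefree(n: int) -> bool:
    # n is squarefree iff no prime square divides it; trial division suffices here.
    d = 2
    while d * d <= n:
        if n % (d * d) == 0:
            return False
        d += 1
    return True

def squarefree_residues(limit: int, modulus: int) -> Counter:
    # Count squarefree n <= limit in each residue class n mod modulus.
    return Counter(n % modulus for n in range(1, limit + 1) if is_squarefree(n))

counts = squarefree_residues(10_000, 4)
print(counts)
```

For modulus 4, residue class 0 is always empty, since any multiple of 4 is divisible by the square 2^2.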

**Category:** Number Theory

[257] **viXra:1705.0288 [pdf]**
*submitted on 2017-05-19 08:21:45*

**Authors:** George Rajna

**Comments:** 13 Pages.

"The best result on dark matter so far—and we just got started." This is how scientists behind XENON1T, now the most sensitive dark matter experiment worldwide, commented on their first result from a short 30-day run, presented today to the scientific community. [13] The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood from the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter. SIMPs would resolve certain discrepancies between simulations of the distribution of dark matter and the observed properties of the galaxies. In particle physics and astrophysics, weakly interacting massive particles, or WIMPs, are among the leading hypothetical particle-physics candidates for dark matter.

**Category:** High Energy Particle Physics

[256] **viXra:1705.0286 [pdf]**
*submitted on 2017-05-19 09:06:12*

**Authors:** George Rajna

**Comments:** 21 Pages.

Astronomers have constructed the first map of the universe based on the positions of supermassive black holes, which reveals the large-scale structure of the universe. [16] Astronomers want to record an image of the heart of our galaxy for the first time: a global collaboration of radio dishes is to take a detailed look at the black hole which is assumed to be located there. [15] A team of researchers from around the world is getting ready to create what might be the first image of a black hole. [14] "There seems to be a mysterious link between the amount of dark matter a galaxy holds and the size of its central black hole, even though the two operate on vastly different scales," said Akos Bogdan of the Harvard-Smithsonian Center for Astrophysics (CfA). [13] If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12] For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in…well…a big crunch (for lack of a better term). Think "the Big Bang", except just the opposite. That's essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black. [11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. 
[10] The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood from the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[255] **viXra:1705.0285 [pdf]**
*submitted on 2017-05-19 09:52:25*

**Authors:** George Rajna

**Comments:** 23 Pages.

The atomic nucleus offers a unique opportunity to study the competition between three of the four fundamental forces known to exist in nature: the strong nuclear interaction, the electromagnetic interaction and the weak nuclear interaction. [11] This paper explains the magnetic effect of the electric current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. New ideas for interactions and particles: this paper also examines the possibility of deriving the Spontaneously Broken Symmetries from the Planck Distribution Law. In this way we get a Unification of the Strong, Electromagnetic, and Weak Interactions from the interference occurrences of oscillators. Understanding that the relativistic mass change is the result of the magnetic induction, we arrive at the conclusion that the Gravitational Force is also based on the electromagnetic forces, giving a Unified Relativistic Quantum Theory of all 4 Interactions.

**Category:** High Energy Particle Physics

[254] **viXra:1705.0284 [pdf]**
*submitted on 2017-05-19 04:31:59*

**Authors:** George Rajna

**Comments:** 30 Pages.

Particle-free quantum communication is achieved in the lab. [18] In the non-intuitive quantum domain, the phenomenon of counterfactuality is defined as the transfer of a quantum state from one site to another without any quantum or classical particle transmitted between them. [17] The quantum internet, which connects particles linked together by the principle of quantum entanglement, is like the early days of the classical internet – no one can yet imagine what uses it could have, according to Professor Ronald Hanson, from Delft University of Technology, the Netherlands, whose team was the first to prove that the phenomenon behind it was real. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. 
[13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. 
The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, making it a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[253] **viXra:1705.0282 [pdf]**
*submitted on 2017-05-19 05:39:30*

**Authors:** George Rajna

**Comments:** 21 Pages.

The transformation of a quantum monopole into a Dirac monopole has been observed for the first time by physicists at Amherst College in the US and Aalto University in Finland. [13] Scientists at Amherst College (USA) and Aalto University (Finland) have made the first experimental observations of the dynamics of isolated monopoles in quantum matter. [12] Building on his own previous research, Amherst College professor David S. Hall '91 and a team of international collaborators have experimentally identified a pointlike monopole in a quantum field for the first time. The discovery, announced this week, gives scientists further insight into the elusive monopole magnet, an elementary particle that researchers believe exists but have not yet seen in nature. [11] For the first time, physicists have achieved interference between two separate atoms: when sent towards the opposite sides of a semi-transparent mirror, the two atoms always emerge together. This type of experiment, which was carried out with photons around thirty years ago, had so far been impossible to perform with matter, due to the extreme difficulty of creating and manipulating pairs of indistinguishable atoms. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. 
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, making it a natural part of the relativistic quantum theory. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[252] **viXra:1705.0281 [pdf]**
*submitted on 2017-05-18 13:41:39*

**Authors:** George Rajna

**Comments:** 26 Pages.

You can't see them, but swarms of electrons are buzzing through the magnetic environment—the magnetosphere—around Earth. [18] When NASA's Magnetospheric Multiscale—or MMS—mission was launched, the scientists knew it would answer questions fundamental to the nature of our universe—and MMS hasn't disappointed. [17] Magnetic reconnection, a universal process that triggers solar flares and northern lights and can disrupt cell phone service and fusion experiments, occurs much faster than theory says that it should. [16] A surprising new class of X-ray pulsating variable stars has been discovered by a team of American and Canadian astronomers led by Villanova University's Scott Engle and Edward Guinan. [15] Late last year, an international team including researchers from the Kavli Institute for Astronomy and Astrophysics (KIAA) at Peking University announced the discovery of more than 60 extremely distant quasars, nearly doubling the number known to science-and thus providing dozens of new opportunities to look deep into our universe's history. [14] Fuzzy pulsars orbiting black holes could unmask quantum gravity. [13] Cosmologists trying to understand how to unite the two pillars of modern science – quantum physics and gravity – have found a new way to make robust predictions about the effect of quantum fluctuations on primordial density waves, ripples in the fabric of space and time. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. 
[11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times.

**Category:** Astrophysics

[251] **viXra:1705.0280 [pdf]**
*submitted on 2017-05-18 16:45:31*

**Authors:** Yanming Wei

**Comments:** 6 pages, 3 figures. DOI: 10.13140/RG.2.2.27195.62248

My recent research has yielded many discoveries and inventions: 1. Thermal solar neutrinos can be focused by a special heavy-metal lens, and the focused neutrinos can catalyze nuclear beta decay with an exponential effect. 2. It is possible to mimic a superconductor with a dyno-capacitor module, cheaply realizing the same effects while working at room temperature or even hundreds of degrees Celsius higher. By combining these two catalysis technologies, we expect to build a powerful high-voltage DC betavoltaic nuclear reactor using the lutetium fuel 176Lu. Although its energy density is far less than that of the conventional fission fuel 235U, it is a very clean nuclear energy source, with non-toxic material and no harmful waste.

**Category:** Nuclear and Atomic Physics

[250] **viXra:1705.0279 [pdf]**
*submitted on 2017-05-18 16:48:24*

**Authors:** Yanming Wei

**Comments:** 4 pages. DOI: 10.13140/RG.2.2.33906.50884

With the catalysis of focused neutrinos and other special means, some double-beta (2β) isotopes can stand out as fuel, provided that two sequential β decays, β1 followed by β2, occur with a positive total energy balance Q(β1) + Q(β2). This paper proposes molybdenum-100 (100Mo) as a promising candidate.

**Category:** Nuclear and Atomic Physics

[249] **viXra:1705.0278 [pdf]**
*submitted on 2017-05-18 17:36:05*

**Authors:** Michail Zak

**Comments:** 11 Pages.

The paper proposes a scenario for the origin and emergence of intelligent life in the Universe, based upon the mathematical discovery of a new class of dynamical systems described by ODEs coupled with their Liouville equation. These systems are called self-controlled, since the role of the actuators is played by the probability produced by the Liouville equation. Following the Madelung equation, which belongs to this class, non-Newtonian and quantum-like properties such as randomness, entanglement, and probability interference, typical of quantum systems, have been described. At the same time, these systems exhibit properties of living systems: decomposition into motor and mental dynamics, the capability of self-identification and self-awareness, as well as self-supervision. But the most surprising discovery is the existence of a special sub-class in which the dynamical systems can violate the second law of thermodynamics, which makes them different from both Newtonian and quantum physics. This sub-class should be associated with intelligent living systems, due to their capability to move from disorder to order without external help. Based upon the mathematical discovery described above, one can assume that there are good chances that similar dynamical systems, representing intelligent living systems, exist in the real physical world. This provides a reason for a "rehabilitation" of the Maxwell demon and puts it into the physics of intelligent systems. Indeed, the Maxwell demon is implemented by the feedback from the Liouville equation to the original ODE, and this feedback is capable of rearranging the probability distribution against the second law of thermodynamics. In addition, the same feedback removes the entropy paradox by explaining the high order in our surroundings by "intelligent life support". Two transitions are analyzed: from Newtonian physics to the linear model of Life, and from the latter to the model of Intelligent Life. 
The first transition is triggered by the Hadamard instability of Newtonian physics with respect to small random disturbances in the linear terms of the Liouville feedback. The second transition is triggered by the instability of the linear model of Life with respect to small random disturbances of the non-linear terms of the Liouville feedback. This transition could be implemented by such physical phenomena as shock waves or negative diffusion in probability space. Both transitions can be associated with catastrophe theory, in which sudden shifts in behavior arise from small changes in the parameters of the model.

**Category:** Relativity and Cosmology

[248] **viXra:1705.0277 [pdf]**
*submitted on 2017-05-19 01:01:55*

**Authors:** Shaban A. Omondi Aura

**Comments:** 30 Pages. Preferably for journals, academies and conferences

This paper is concerned with the formulation and demonstration of new versions of equations that can help us resolve problems concerning maximal gaps between consecutive prime numbers, the number of prime numbers at a given magnitude, and the location of the nth prime number. There is also a mathematical argument on why prime numbers, as elementary identities in their own right, behave the way they do. Given that the equations have already been formulated, there are worked-out examples on numbers that represent different cohorts. This paper has therefore attempted to formulate an equation that approximates the number of prime numbers at a given magnitude, from N = 3 to N = 10^25. Concerning the location of an nth prime number, the paper has devised a method that can help us locate a given prime number within specified bounds. The paper has also formulated an equation that can help us determine extremely bounded gaps. Lastly, using a trans-algebraic number theory method, the paper has shown that the unpredictable behaviors of prime numbers are due to their identity nature.

**Category:** Number Theory

[247] **viXra:1705.0276 [pdf]**
*submitted on 2017-05-19 01:41:04*

**Authors:** Gavin R. Putland

**Comments:** 41 pages (main text: 37 pages).

A time-variation in magnetic flux density *B* may occur because the field *changes* and/or because the field *moves* relative to the observation point. Faraday's law for a fixed circuit makes no distinction between these causes. But the latter cause is isolated by the magnetic term in the Lorentz force law, which, in a reference frame fixed with respect to the particle, implies that a field *B* moving at velocity *r* induces an electric field *E* = −*r* × *B*. In the case of a traveling electromagnetic wave, *r* is the *ray* velocity (hence the symbol).

Similarly, a time-variation in the electric displacement field * D* may occur because the field changes and/or because the field moves. The Maxwell-Ampère law makes no distinction between these causes. But, by analogy with the Lorentz force law, the latter cause can be isolated by saying that a *D* field moving at velocity *r* induces a magnetizing field *H* = *r* × *D*.
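Both "moving field" laws can be checked numerically in the simplest case, a plane wave in vacuum: with ray velocity *r* = *c* k̂ and the standard plane-wave relations *B* = (k̂ × *E*)/*c*, *D* = ε₀*E*, *H* = *B*/μ₀, one recovers *E* = −*r* × *B* and *H* = *r* × *D*. A sketch under those assumptions (field values chosen arbitrarily for illustration):

```python
import numpy as np

c = 299_792_458.0          # speed of light (m/s)
eps0 = 8.8541878128e-12    # vacuum permittivity (F/m)
mu0 = 1.0 / (eps0 * c**2)  # vacuum permeability (H/m)

k_hat = np.array([0.0, 0.0, 1.0])  # propagation direction
E = np.array([1.0, 0.0, 0.0])      # transverse E field (V/m), arbitrary
B = np.cross(k_hat, E) / c         # plane-wave relation B = (k̂ × E)/c
D = eps0 * E
H = B / mu0

r = c * k_hat                      # ray velocity of the wave

# The two "moving field" laws from the text:
assert np.allclose(E, -np.cross(r, B))  # E = -r × B
assert np.allclose(H, np.cross(r, D))   # H = r × D
print("moving-field laws hold for a vacuum plane wave")
```

In a birefringent crystal the same check would use the anisotropic constitutive relations between *D* and *E*, which is where Fresnel's ray-velocity surface enters.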

The two "moving field" laws, combined with the relations between *D* and *E* and between *B* and *H*, yield an unusually simple theory of electromagnetic waves, including a derivation of Fresnel's equation for the ray-velocity surface of a non-chiral birefringent crystal. Taking cross-products of the "moving field" laws with the wave-slowness vector, we obtain two more "moving field" equations in terms of wave slowness (generalizing the conventional formulation in terms of the wave *vector*). The last two equations, by analogy with the first two, yield Hamilton's wave-slowness surface. Comparing the results, we can conclude that the ray-velocity and wave-slowness surfaces of a biaxial crystal have curves of contact with tangent planes, and deduce the associated polarizations. Eigenvectors are introduced to show that, in general, the permitted polarizations for a given propagation direction are orthogonal. A coordinate transformation (simpler than Hamilton's) shows that the curves of contact are circles and yields their linear and angular diameters.

Among the footnotes are interpretations of the Poynting vector and the Minkowski momentum density. The text includes introductory material intended to make it comprehensible to high-school graduates.

(P.S.: In this abstract, vectors are shown in italics because boldface is not permitted.)
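The two "moving field" laws above can be checked numerically for a vacuum plane wave, where the ray velocity has magnitude c along the propagation direction. The sketch below is an illustration only; the geometry (propagation along z, B along y) and the field amplitude are assumptions of mine, not values from the paper:

```python
import numpy as np

# Hedged sketch: verify the "moving field" laws E = -r x B and H = r x D
# for a vacuum plane wave.  Assumed geometry (not from the paper):
# propagation along z, B along y, so the ray velocity r = c z-hat.
c = 299792458.0            # speed of light, m/s
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
mu0 = 1.0 / (eps0 * c**2)  # vacuum permeability, consistent with c

B = np.array([0.0, 1e-8, 0.0])   # magnetic flux density, T
r = np.array([0.0, 0.0, c])      # ray velocity of the wave

E = -np.cross(r, B)              # moving-B law: induced electric field
D = eps0 * E                     # vacuum constitutive relation
H = np.cross(r, D)               # moving-D law: induced magnetizing field

print(E)         # points along x, with |E| = c|B|
print(H - B/mu0) # zero: H agrees with the usual vacuum relation H = B/mu0
```

With these inputs, E comes out along x with |E| = c|B|, and H = r × D reproduces H = B/μ0, consistent with the abstract's claim that the two laws yield standard electromagnetic-wave behaviour.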

**Category:** Classical Physics

[246] **viXra:1705.0275 [pdf]**
*submitted on 2017-05-18 13:29:52*

**Authors:** Domenico Oricchio

**Comments:** 1 Page.

I try to obtain the theoretical minimum energy for the transfer orbit of a spaceship.

**Category:** Astrophysics

[245] **viXra:1705.0274 [pdf]**
*replaced on 2017-12-28 08:38:31*

**Authors:** Alexandre Harvey-Tremblay

**Comments:** 75 Pages.

We propose a meta-logical framework to understand the world by an ensemble of theorems rather than by a set of axioms. We prove that the theorems of the ensemble must have *feasible* proofs and must recover *universality*. The ensemble is axiomatized when it is constructed as a partition function, in which case its axioms are, up to an error rate, the leading bits of Omega (the halting probability of a prefix-free universal Turing machine). The partition function augments the standard construction of Omega with knowledge of the size of the proof of each theorem. With this knowledge, it is able to decide *feasible mathematics*.
As a consequence of the axiomatization, the ensemble additionally adopts the mathematical structure of an ensemble of statistical physics; it is from this context that the laws of physics are derived. The Lagrange multipliers of the partition function are the fundamental Planck units and the background, a thermal space-time, emerges as a consequence of the limits applicable to the conjugate pairs. The background obeys the relations of special and general relativity, dark energy, the arrow of time, the Schrödinger equation, the Dirac equation and it embeds the holographic principle. In this context, the limits of feasible mathematics are mathematically the same as the laws of physics.
The framework is so fundamental that informational equivalents to length, time and mass (assumed as axioms in most physical theories) are here formally derivable. Furthermore, it can prove that no alternative framework can contain fewer bits of axioms than it contains (thus it is necessarily the simplest theory). Moreover, it can prove that, for all worlds amenable to this framework, the laws of physics will be the same (hence there can be no alternatives).
Thus, the framework is a possible candidate for a final theory.

**Category:** Thermodynamics and Energy

[244] **viXra:1705.0273 [pdf]**
*submitted on 2017-05-18 10:06:56*

**Authors:** George Rajna

**Comments:** 31 Pages.

Google's computer programs are gaining a better understanding of the world, and now it wants them to handle more of the decision-making for the billions of people who use its services. [18] Microsoft on Wednesday unveiled new tools intended to democratize artificial intelligence by enabling machine smarts to be built into software from smartphone games to factory floors. [17] The closer we can get a machine translation to be on par with expert human translation, the happier lots of people struggling with translations will be. [16] Researchers have created a large, open source database to support the development of robot activities based on natural language input. [15] A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14] A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13] Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12] A team of researchers working at the University of California (and one from Stony Brook University) has for the first time created a neural-network chip that was built using just memristors. In their paper published in the journal Nature, the team describes how they built their chip and what capabilities it has.
[11] A team of researchers used a promising new material to build more functional memristors, bringing us closer to brain-like computing. Both academic and industrial laboratories are working to develop computers that operate more like the human brain. Instead of operating like a conventional, digital system, these new devices could potentially function more like a network of neurons. [10]

**Category:** Artificial Intelligence

[243] **viXra:1705.0272 [pdf]**
*submitted on 2017-05-18 10:44:33*

**Authors:** George Rajna

**Comments:** 21 Pages.

Weakly-interacting sparticles are produced at lower rates and lead to less striking signatures, making them more difficult to distinguish from Standard Model background processes. [18] Supersymmetry (SUSY) is one of the most attractive theories extending the Standard Model of particle physics. [17] If researchers at Florida Institute of Technology, employing pioneering new methods, are able to determine the top quark's mass at a level of precision as yet unachieved, they will move science closer to understanding whether the universe is stable, as we have long believed to be the case, or unstable. [16] Last February, scientists made the groundbreaking discovery of gravitational waves produced by two colliding black holes. Now researchers are expecting to detect similar gravitational wave signals in the near future from collisions involving neutron stars—for example, the merging of two neutron stars to form a black hole, or the merging of a neutron star and a black hole. [15] In a new study published in EPJ A, Susanna Liebig from Forschungszentrum Jülich, Germany, and colleagues propose a new approach to nuclear structure calculations. The results are freely available to the nuclear physicists' community so that other groups can perform their own nuclear structure calculations, even if they have only limited computational resources. [14] The PHENIX detector at the Relativistic Heavy Ion Collider (RHIC), a particle accelerator at Brookhaven National Laboratory, is uniquely capable of measuring how a proton's internal building blocks — quarks and gluons — contribute to its overall intrinsic angular momentum, or "spin." [13] More realistic versions of lattice QCD may lead to a better understanding of how quarks formed hadrons in the early Universe. The resolution of the Proton Radius Puzzle is the diffraction pattern, which gives a different wavelength for the proton in the case of muonic hydrogen oscillation than in the case of normal hydrogen, because of the different mass rate.
Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[242] **viXra:1705.0271 [pdf]**
*replaced on 2017-07-29 23:00:25*

**Authors:** Frank Dodd Tony Smith Jr

**Comments:** 36 Pages.

Consider three cases. First Case (pages 2-4): Does E8 represent the Realistic Standard Model plus Gravity? Consensus = NO; Individual = YES. Second Case (pages 5-29): Is our Universe stable? Consensus = NO (only metastable); Individual = YES. Third Case (pages 30-36): Dark Energy and Dark Matter. Consensus = Unknown; Individual = Segal Conformal Structure. This paper is a brief description of interactions between Consensus and Individual in each of those cases, where Consensus = the Physics Establishment, including: the organizers of the 2010 Banff Workshop on Structure and Representations of Exceptional Groups (pages 3-4); Moriond 2017 (page 4); the Princeton Institute for Advanced Study (page 4); the Simons Center for Geometry and Physics (page 4); Fermilab and the CDF and D0 Collaborations (pages 9-17); the Cornell arXiv (pages 16, 30-31); CERN CDS (pages 17, 31); and the LHC and the ATLAS and CMS Collaborations (pages 18-29); and Individual = I, a Georgia lawyer with a 1963 AB in math from Princeton and some physics study at Georgia Tech with David Finkelstein as adviser, but, having at age 50 failed the Fall 1991 Georgia Tech Comprehensive Exam (a 3-day closed-book exam), I have no physics degree. Version 2 (v2) adds the correct viXra number and some details about Fermilab data. Version 3 (v3) adds the First Case, more details, and gives thanks to ATLAS for ATLAS-CONF-2017-058, which states the existence of a possible 240 GeV Higgs mass state at 3.6 sigma local significance.

**Category:** High Energy Particle Physics

[241] **viXra:1705.0270 [pdf]**
*submitted on 2017-05-18 06:20:19*

**Authors:** George Rajna

**Comments:** 28 Pages.

In the race to produce a quantum computer, a number of projects are seeking a way to create quantum bits—or qubits—that are stable, meaning they are not much affected by changes in their environment. [18] The global race towards a functioning quantum computer is on. With future quantum computers, we will be able to solve previously impossible problems and develop, for example, complex medicines, fertilizers, or artificial intelligence. [17] The Tohoku University research group of Professor Keiichi Edamatsu and Postdoctoral fellow Naofumi Abe has demonstrated dynamically and statically unpolarized single-photon generation using diamond. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. 
[13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. 
The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[240] **viXra:1705.0269 [pdf]**
*submitted on 2017-05-18 06:40:53*

**Authors:** J.A.J. van Leunen

**Comments:** 2 Pages.

This document introduces the Wikiversity Hilbert Book Model Project and describes its current state.

**Category:** Quantum Physics

[239] **viXra:1705.0268 [pdf]**
*submitted on 2017-05-17 13:12:14*

**Authors:** George Rajna

**Comments:** 20 Pages.

Scientists at Amherst College (USA) and Aalto University (Finland) have made the first experimental observations of the dynamics of isolated monopoles in quantum matter. [12] Building on his own previous research, Amherst College professor David S. Hall '91 and a team of international collaborators have experimentally identified a pointlike monopole in a quantum field for the first time. The discovery, announced this week, gives scientists further insight into the elusive monopole magnet, an elementary particle that researchers believe exists but have not yet seen in nature. [11] For the first time, physicists have achieved interference between two separate atoms: when sent towards the opposite sides of a semi-transparent mirror, the two atoms always emerge together. This type of experiment, which was carried out with photons around thirty years ago, had so far been impossible to perform with matter, due to the extreme difficulty of creating and manipulating pairs of indistinguishable atoms. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. 
One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[238] **viXra:1705.0267 [pdf]**
*submitted on 2017-05-17 13:30:31*

**Authors:** George Rajna

**Comments:** 22 Pages.

Quantum field theories are often hard to verify in experiments. Now, there is a new way of putting them to the test. [13] Scientists at Amherst College (USA) and Aalto University (Finland) have made the first experimental observations of the dynamics of isolated monopoles in quantum matter. [12] Building on his own previous research, Amherst College professor David S. Hall '91 and a team of international collaborators have experimentally identified a pointlike monopole in a quantum field for the first time. The discovery, announced this week, gives scientists further insight into the elusive monopole magnet, an elementary particle that researchers believe exists but have not yet seen in nature. [11] For the first time, physicists have achieved interference between two separate atoms: when sent towards the opposite sides of a semi-transparent mirror, the two atoms always emerge together. This type of experiment, which was carried out with photons around thirty years ago, had so far been impossible to perform with matter, due to the extreme difficulty of creating and manipulating pairs of indistinguishable atoms. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. 
The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[237] **viXra:1705.0266 [pdf]**
*submitted on 2017-05-17 14:58:18*

**Authors:** Yanming Wei

**Comments:** 7 pages, 2 figures. DOI: 10.13140/RG.2.2.26828.62084

Many countries' standards institutes have long struggled to calibrate the half-life of the free neutron accurately using different methods; unfortunately, they are all confronted by a mysterious, unexplained discrepancy: the in-beam method gives a value about 1% longer than the bottle method, raising the question of whether undiscovered new physics lies therein. In this paper, I assert that nothing new is needed, and that the puzzle can be explained by a spontaneous bosonization effect, defined here, acting on densely packed neutrons. Finally, some suggested lines of research and possible applications are presented.

**Category:** High Energy Particle Physics

[236] **viXra:1705.0264 [pdf]**
*submitted on 2017-05-18 01:47:51*

**Authors:** Syed Afsar Abbas

**Comments:** 9 Pages.

A spin angular momentum state with a polarization orientation in any arbitrary direction can be constructed as a spinor in the SU(2)-spin space as χ = a|↑> + b|↓>. However, the corresponding isospinor in the SU(2)-isospin space, ψ = a|p> + b|n>, is discarded on empirical grounds. Still, we do not have any sound theoretical understanding of this phenomenon. Here we provide a consistent explanation of this effect.

**Category:** High Energy Particle Physics

[235] **viXra:1705.0262 [pdf]**
*replaced on 2017-10-24 07:14:41*

**Authors:** D.K.K. Adjaï, L. H. Koudahoun, J. Akande, Y.J.F. Kpomahou, M. D. Monsia

**Comments:** 12 pages

This paper shows that explicit and exact general periodic solutions for various types of Liénard equations can be computed by applying the generalized Sundman transformation. As an illustration of the efficiency of the proposed theory, the cubic Duffing equation and the Painlevé-Gambier equations were considered. As a major result, it has been found, for the first time, that equation XII of the Painlevé-Gambier classification can exhibit, according to an appropriate parametric choice, trigonometric solutions, but with a shift factor.

**Category:** Mathematical Physics

[234] **viXra:1705.0261 [pdf]**
*submitted on 2017-05-18 03:29:03*

**Authors:** Zhi Cheng

**Comments:** 14 Pages. Include Chinese version

In this paper, we propose the concept of a vector complex function to prove that the whole world can be reduced to a very simple function f(Z) = F + iG, by introducing results from complex function theory. We can also derive the Maxwell equations through differential and integral analysis of the vector complex function.

**Category:** Mathematical Physics

[233] **viXra:1705.0260 [pdf]**
*replaced on 2018-03-28 12:46:32*

**Authors:** Yongfeng Yang

**Comments:** 44 Pages.

Plate motion was widely thought to be a manifestation of mantle dynamics. However, an in-depth investigation shows this understanding to be inadequate. Here we propose that the tide-related oceans yield varying pressures between them; the application of these pressures to a continent's sides forms enormously unequal horizontal forces (i.e., the ocean-generating forces); the net effect of these forces provides a lateral push to the continent and may cause it to move horizontally; further, the travelling continent drives its adjacent crust to move. Together these constitute plate motion. A rough estimation shows that the ocean-generating force may give the South American, African, Indian, and Australian continents movements of respectively 2.8, 4.2, 5.7, and 6.3 cm/yr, and give the Pacific Plate a movement of 8.9 cm/yr. Some torque effects of the ocean-generating force contribute to rotating the North American and Eurasian continents.

**Category:** Geophysics

[232] **viXra:1705.0259 [pdf]**
*replaced on 2017-11-03 17:35:05*

**Authors:** Tamas Lajtner

**Comments:** 13 Pages.

The de Broglie wavelength describes wave-particle duality. The de Broglie wavelength formula and the Planck law appear to contradict each other in tunneling. Tunneling fast waves have longer wavelengths than "normal" waves. According to the de Broglie formula, a longer wavelength means smaller momentum (smaller energy). But fast waves have the same amount of energy as normal waves, since they can be transformed into each other.
This longer wavelength is not based on the refractive index of the barrier. The barrier in tunneling cannot be seen as an optical medium, but rather as a special kind of space made out of matter that other matter is able to use as space. Here we show that the 'rest actions', 'rest energies' of fast waves in different spaces can resolve the contradiction. This 'rest action' of the wave is a new concept that hasn't been considered before. It is hidden in the Planck constant. In uncovering this part, we find that the Planck constant has two parts: one part shows the 'rest action', 'rest energy' of the fast wave, and another part shows the 'kinetic action', 'kinetic energy' of fast waves. The Planck constant seems to have a more general role than we have previously thought.
Fast waves are made out of normal waves (or particles). A fast wave is the same particle in a different form. The Fast Wave–Wave–Particle Triality describes a new kind of metamorphosis of matter: how tunneling electrons travel faster than light without violating special relativity. Using the Fast Wave–Wave–Particle Triality, we can see that the speed of light is not a speed limit for particles with mass, since they can be transformed into fast waves. The Fast Wave–Wave–Particle Triality marks the limit of the scope of the special theory of relativity and opens a new worldview.

**Category:** Quantum Physics

[231] **viXra:1705.0258 [pdf]**
*submitted on 2017-05-17 07:36:10*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 2 Pages.

Access to metaspace and the metamorphic ratio.

**Category:** High Energy Particle Physics

[230] **viXra:1705.0257 [pdf]**
*submitted on 2017-05-17 07:42:21*

**Authors:** George Rajna

**Comments:** 16 Pages.

Physicist Professor Chunnong Zhao and his recent PhD students Haixing Miao and Yiqiu Ma are members of an international team that has created a particularly exciting new design for gravitational wave detectors. [9] A proposal for a gravitational-wave detector made of two space-based atomic clocks has been unveiled by physicists in the US. [8] The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA. [7] A team of researchers with the University of Lisbon has created simulations that indicate that the gravitational waves detected by researchers with the LIGO project, and which are believed to have come about due to two black holes colliding, could just as easily have come from another object such as a gravastar (an object believed to have its inside made of dark energy) or even a wormhole. In their paper published in Physical Review Letters, the team describes the simulations they created, what was seen and what they are hoping to find in the future. [6] In a landmark discovery for physics and astronomy, international scientists said Thursday they have glimpsed the first direct evidence of gravitational waves, or ripples in space-time, which Albert Einstein predicted a century ago. [5] Scientists at the National Institute for Space Research in Brazil say an undiscovered type of matter could be found in neutron stars. Here matter is so dense that it could be 'squashed' into strange matter. This would create an entire 'strange star', unlike anything we have seen. [4] The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the electromagnetic inertia, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces.
Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Astrophysics

[229] **viXra:1705.0256 [pdf]**
*submitted on 2017-05-17 08:12:52*

**Authors:** George Rajna

**Comments:** 42 Pages.

Two teams working independently have conducted studies with similar results suggesting the possibility that some of the cosmic rays striking the Earth arise from dark matter particles colliding with one another. [29] A mysterious gamma-ray glow at the center of the Milky Way is most likely caused by pulsars – the incredibly dense, rapidly spinning cores of collapsed ancient stars that were up to 30 times more massive than the sun. [28] Further evidence of the existence of dark matter – the mysterious substance that is believed to hold the Universe together – has been produced by Cosmologists at Durham University. [27] Researchers at the University of Waterloo have been able to capture the first composite image of a dark matter bridge that connects galaxies together. [26] In an abandoned gold mine one mile beneath Lead, South Dakota, the cosmos quiets down enough to potentially hear the faint whispers of the universe's most elusive material—dark matter. [25] The PICO bubble chambers use temperature and sound to tune into dark matter particles. [24] A detection device designed and built at Yale is narrowing the search for dark matter in the form of axions, a theorized subatomic particle that may make up as much as 80% of the matter in the universe. [23] The race is on to build the most sensitive U.S.-based experiment designed to directly detect dark matter particles. Department of Energy officials have formally approved a key construction milestone that will propel the project toward its April 2020 goal for completion. [22] Scientists at the Center for Axion and Precision Physics Research (CAPP), within the Institute for Basic Science (IBS) have optimized some of the characteristics of a magnet to hunt for one possible component of dark matter called axion. [21] The first sighting of clustered dwarf galaxies bolsters a leading theory about how big galaxies such as our Milky Way are formed, and how dark matter binds them, researchers said Monday. [20]

**Category:** Astrophysics

[228] **viXra:1705.0255 [pdf]**
*submitted on 2017-05-17 08:56:39*

**Authors:** George Rajna

**Comments:** 20 Pages.

Supersymmetry (SUSY) is one of the most attractive theories extending the Standard Model of particle physics. [17]
If researchers at Florida Institute of Technology, employing pioneering new methods, are able to determine the top quark's mass at a level of precision as yet unachieved, they will move science closer to understanding whether the universe is stable, as we have long believed to be the case, or unstable. [16]
Last February, scientists made the groundbreaking discovery of gravitational waves produced by two colliding black holes. Now researchers are expecting to detect similar gravitational wave signals in the near future from collisions involving neutron stars—for example, the merging of two neutron stars to form a black hole, or the merging of a neutron star and a black hole. [15]
In a new study published in EPJ A, Susanna Liebig from Forschungszentrum Jülich, Germany, and colleagues propose a new approach to nuclear structure calculations. The results are freely available to the nuclear physicists' community so that other groups can perform their own nuclear structure calculations, even if they have only limited computational resources. [14]
The PHENIX detector at the Relativistic Heavy Ion Collider (RHIC), a particle accelerator at Brookhaven National Laboratory, is uniquely capable of measuring how a proton's internal building blocks — quarks and gluons — contribute to its overall intrinsic angular momentum, or "spin." [13]
More realistic versions of lattice QCD may lead to a better understanding of how quarks formed hadrons in the early Universe.
The resolution of the Proton Radius Puzzle is the diffraction pattern, which gives a different wavelength for the proton in the case of muonic hydrogen oscillation than in the case of normal hydrogen, because of the different mass rate.
Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[227] **viXra:1705.0254 [pdf]**
*submitted on 2017-05-17 06:34:34*

**Authors:** Yibing Qiu

**Comments:** 1 Page.

Abstract: showing a viewpoint with regard to the image.

**Category:** Astrophysics

[226] **viXra:1705.0253 [pdf]**
*submitted on 2017-05-16 13:34:53*

**Authors:** George Rajna

**Comments:** 23 Pages.

Stars, quasars, and other celestial objects generate photons in a random way, and now scientists have taken advantage of this randomness to generate random numbers at rates of more than one million numbers per second. [18] UBC physicists may have solved one of nature's great puzzles: what causes the accelerating expansion of our universe? [17] A team of scientists at the Tata Institute of Fundamental Research (TIFR), Mumbai, India, have found new ways to detect a bare or naked singularity, the most extreme object in the universe. [16] New data from NASA's Chandra X-ray Observatory and other telescopes has revealed details about this giant black hole, located some 145 million light years from Earth. [15] A team of researchers from around the world is getting ready to create what might be the first image of a black hole. [14] "There seems to be a mysterious link between the amount of dark matter a galaxy holds and the size of its central black hole, even though the two operate on vastly different scales," said Akos Bogdan of the Harvard-Smithsonian Center for Astrophysics (CfA). [13] If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12] For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in…well…a big crunch (for lack of a better term). Think "the Big Bang", except just the opposite. That's essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black.
[11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracting the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. There is an asymmetry between the mass of the electric charges, for example proton and electron, can understood by the asymmetrical Planck Distribution Law. This temperature dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high probability event. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level, it is the dark energy and the corresponding matter is the dark matter.
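The photon-based generator of item [18] is not specified here, but the general task it solves, turning a biased physical bit stream into unbiased random numbers, can be sketched with the classic von Neumann extractor. This is a generic illustration only; the 0.7 bias and the simulated bit source below are hypothetical stand-ins for real photon-detection data.

```python
import random

def von_neumann_extract(bits):
    """Collapse bit pairs: 01 -> 0, 10 -> 1, discard 00 and 11.
    Removes bias from independent, identically biased bits."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Hypothetical biased source standing in for photon-detection bits.
random.seed(1)
raw = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]
unbiased = von_neumann_extract(raw)
print(sum(raw) / len(raw))            # biased, roughly 0.7
print(sum(unbiased) / len(unbiased))  # close to 0.5 after extraction
```

The price of debiasing is throughput: on average only 2p(1-p) of the input pairs yield an output bit, which is one reason raw physical generators quote much higher rates than their whitened output.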

**Category:** Astrophysics

[225] **viXra:1705.0252 [pdf]**
*submitted on 2017-05-16 13:56:59*

**Authors:** George Rajna

**Comments:** 26 Pages.

Energy dissipation is a key ingredient in understanding many physical phenomena in thermodynamics, photonics, chemical reactions, nuclear fission, photon emission, and even electronic circuits, among others. [15]
The likelihood of seeing quantum systems violating the second law of thermodynamics has been calculated by UCL scientists. [14]
For more than a century and a half of physics, the Second Law of Thermodynamics, which states that entropy always increases, has been as close to inviolable as any law we know. In this universe, chaos reigns supreme. [13]
Physicists have shown that the three main types of engines (four-stroke, two-stroke, and continuous) are thermodynamically equivalent in a certain quantum regime, but not at the classical level. [12]
For the first time, physicists have performed an experiment confirming that thermodynamic processes are irreversible in a quantum system—meaning that, even on the quantum level, you can't put a broken egg back into its shell. The results have implications for understanding thermodynamics in quantum systems and, in turn, designing quantum computers and other quantum information technologies. [11]
Disorder, or entropy, in a microscopic quantum system has been measured by an international group of physicists. The team hopes that the feat will shed light on the "arrow of time": the observation that time always marches towards the future. The experiment involved continually flipping the spin of carbon atoms with an oscillating magnetic field and links the emergence of the arrow of time to quantum fluctuations between one atomic spin state and another. [10]
Mark M. Wilde, Assistant Professor at Louisiana State University, has improved this theorem in a way that allows for understanding how quantum measurements can be approximately reversed under certain circumstances. The new results allow for understanding how quantum information that has been lost during a measurement can be nearly recovered, which has potential implications for a variety of quantum technologies. [9]
Today, we are capable of measuring the position of an object with unprecedented accuracy, but quantum physics and the Heisenberg uncertainty principle place fundamental limits on our ability to measure. Noise that arises as a result of the quantum nature of the fields used to make those measurements imposes what is called the "standard quantum limit." This same limit influences both the ultrasensitive measurements in nanoscale devices and the kilometer-scale gravitational wave detector at LIGO. Because of this troublesome background noise, we can never know an object's exact location, but a recent study provides a solution for rerouting some of that noise away from the measurement. [8]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explain also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Thermodynamics and Energy

[224] **viXra:1705.0251 [pdf]**
*submitted on 2017-05-16 19:11:11*

**Authors:** Victor Christianto

**Comments:** 39 Pages. This paper has not been submitted to a journal

The present book consists of six papers that my colleagues and I developed over the last 3-4 years. The subjects discussed cover wireless energy transmission, a soliton model of DNA, cosmology, and solutions of the Navier-Stokes equations in both 2D and 3D.
Some additional graphical plots of solutions of the 3D Navier-Stokes equations are also given. Hopefully readers will find these papers at least interesting to ponder.

**Category:** Mathematical Physics

[223] **viXra:1705.0250 [pdf]**
*submitted on 2017-05-17 02:27:13*

**Authors:** Antonio Puccini

**Comments:** 4 Pages.

The discovery of the Higgs boson (HB) has revealed a highly massive particle, whose mass lies between 125 and 126.5 GeV/c^2. Bearing in mind the basic concepts of Quantum Field Theory, and in full compliance with the Heisenberg Uncertainty Principle, we were able to calculate the maximum limit of the HB's range: in perfect agreement with its high mass, it has a very small value, slightly less than 10^-15 cm, namely 9.8828 ∙ 10^-16 cm.
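A value of this size follows from the length scale λ = hc/E set by the rest energy. This is a sketch of my own (the paper's exact formula is not given here; using Planck's constant h rather than ħ is my assumption, chosen because it reproduces the quoted magnitude):

```python
# Estimate the Higgs boson's maximum range as the length scale
# lambda = h*c / E implied by its rest energy E.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electron-volt

def range_cm(mass_GeV):
    E = mass_GeV * 1e9 * eV   # rest energy in joules
    return h * c / E * 100.0  # h*c/E in metres, converted to cm

for m in (125.0, 126.5):
    print(f"m = {m} GeV/c^2 -> range ~ {range_cm(m):.4e} cm")
# Both endpoints give a range slightly below 1e-15 cm,
# the same order as the abstract's 9.8828e-16 cm.
```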

**Category:** Quantum Physics

[222] **viXra:1705.0249 [pdf]**
*submitted on 2017-05-16 08:26:20*

**Authors:** Andrej Liptaj

**Comments:** 6 Pages.

A set of functions which allows easy derivative-matching is proposed. Several examples of approximations are shown.

**Category:** Functions and Analysis

[221] **viXra:1705.0248 [pdf]**
*submitted on 2017-05-16 09:00:51*

**Authors:** George Rajna

**Comments:** 25 Pages.

Precision measurement on heavy ions contradicts the theory of the interaction between the atomic nucleus and the electron. [15]
For the first time, scientists have succeeded in studying the strength of hydrogen bonds in a single molecule using an atomic force microscope. [14]
International team solves mystery of colloidal chains. [13]
An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons—thought to be indivisible building blocks of nature—to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation; here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns.
The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explain also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[220] **viXra:1705.0247 [pdf]**
*submitted on 2017-05-15 13:53:47*

**Authors:** George Rajna

**Comments:** 21 Pages.

UBC physicists may have solved one of nature's great puzzles: what causes the accelerating expansion of our universe? [17]
A team of scientists at the Tata Institute of Fundamental Research (TIFR), Mumbai, India, have found new ways to detect a bare or naked singularity, the most extreme object in the universe. [16]
New data from NASA's Chandra X-ray Observatory and other telescopes has revealed details about this giant black hole, located some 145 million light years from Earth. [15]
A team of researchers from around the world is getting ready to create what might be the first image of a black hole. [14]
"There seems to be a mysterious link between the amount of dark matter a galaxy holds and the size of its central black hole, even though the two operate on vastly different scales," said Akos Bogdan of the Harvard-Smithsonian Center for Astrophysics (CfA). [13]
If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12]
For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in…well…a big crunch (for lack of a better term). Think “the Big Bang”, except just the opposite. That’s essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black. [11]
Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10]
The gravitational force attracts matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy.
There is an asymmetry between the masses of the electric charges, for example the proton and electron, which can be understood from the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[219] **viXra:1705.0246 [pdf]**
*submitted on 2017-05-15 14:14:41*

**Authors:** L.saidani

**Comments:** 8 Pages.

The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time are discretized, its laws are iterative and precise, and probability plays an important role.
I first define the notion of the image function and its mathematical framework. The notion of the nokton and its state is the basis of several definitions; I then define the canonical image function and the canonical contribution.
Two constants are needed to define the dynamics of the theory. Given its combinatorial complexity, the theory has at present given no result that seems interesting to me; this document is only a foundation.
Among the merits of this theory are the absence of infinities and an interpretation that, unlike those of quantum mechanics and general relativity, does not offend the physicist's common sense.

**Category:** Quantum Physics

[218] **viXra:1705.0245 [pdf]**
*submitted on 2017-05-15 16:37:35*

**Authors:** Terubumi Honjou

**Comments:** 4 Pages.

According to the pulsation hypothesis, the gravitational action of the crowd of solar photons continually striking a space probe produces a deviation in its orbit.
In the Pioneer anomaly, the trajectory of the Pioneer space probes was computed from the gravitational attraction of the Sun's mass alone; the gravitation of sunlight itself was not included in the orbit calculation. For 30 years, despite great effort, no one has untied the mystery of why the probes are pulled little by little toward the Sun, deviating from the calculated orbit by an acceleration roughly one hundred-millionth of one hundredth (about 10^-10) of gravity. By analogy with the elementary particle pulsation principle, I think that the group of solar photons arriving at the exploration satellite carries gravitation, conveyed by a matter wave from the Sun. The space between the probe and the Sun is vacuum; there is no atmosphere to attenuate the sunlight. Since an enormous number of high-energy photons collide with the probe, the radiation pressure of sunlight (as in a photon rocket) should always push it away from the Sun. That the probe is instead drawn toward the Sun means the gravity exerted by the photon group is larger than, and overcomes, this radiative repulsion; hence the mysterious deviation from the orbit calculated from the Sun's mass alone.
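For scale, the widely quoted value of the anomalous Pioneer acceleration is a_P ≈ 8.74 × 10^-10 m/s² (Anderson et al.); a quick check, sketched below with standard values, confirms it is indeed of order 10^-10 of Earth's surface gravity, matching the fraction invoked in the abstract:

```python
# Compare the reported anomalous Pioneer acceleration with
# Earth's standard surface gravity.
a_pioneer = 8.74e-10  # m/s^2, widely quoted anomaly magnitude
g_earth = 9.80665     # m/s^2, standard surface gravity

ratio = a_pioneer / g_earth
print(f"a_P / g = {ratio:.2e}")  # of order 1e-10
```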

**Category:** Astrophysics

[217] **viXra:1705.0244 [pdf]**
*submitted on 2017-05-15 17:48:18*

**Authors:** Fang Zhou

**Comments:** 13 pages in Chinese

The Galilean Transformation is, strictly speaking, applicable only in the assumed case of unlimited light velocity. The Galilean Transformation describes the composition of observers' observation vectors. The author of the article, utilizing this inherent property of the Galilean Transformation, presents the simplest deduction of the space-time transformation that objectively holds for motion observation under the circumstances of limited light velocity, and reveals the 'Law of Equi-Status for Reference Systems' as well.
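The limiting relation invoked here is the standard one: the Lorentz transformation x' = γ(x − vt), t' = γ(t − vx/c²) reduces to the Galilean x' = x − vt, t' = t as the light velocity is taken toward infinity. A numerical sketch (sample event and boost values are arbitrary):

```python
import math

def lorentz(x, t, v, c):
    """Boost the event (x, t) by velocity v, with light speed c."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (x - v * t), gamma * (t - v * x / c**2)

def galilean(x, t, v):
    """The unlimited-light-velocity limit of the same boost."""
    return x - v * t, t

x, t, v = 1.0e3, 2.0, 50.0         # arbitrary sample event and boost
for c in (3.0e8, 3.0e12, 3.0e16):  # let c grow toward "unlimited"
    print(c, lorentz(x, t, v, c))
print(galilean(x, t, v))           # the Galilean limit: (900.0, 2.0)
```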

**Category:** Relativity and Cosmology

[216] **viXra:1705.0243 [pdf]**
*replaced on 2017-05-18 16:56:26*

**Authors:** Yanming Wei

**Comments:** 15 pages, 3 figures. DOI: 10.13140/RG.2.2.15595.75045

This paper emphasizes how much energy is hidden in ubiquitous atmospheric water vapor, and how spectacular and subtle natural evaporation is, by visualizing tedious thermodynamic data at vivid macroscopic and microscopic scales with different gauges: kJ/kg, eV/molecule, and photonic wavelength per single step of water molecular clusterization during condensation, for energy density estimation; and mm/day and nm/s, for average evaporation rate calculation. Condensation is described for the first time as a special invisible infrared combustion, and it is proved that it is theoretically possible to convert its latent heat to high-grade thermal energy.
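The unit conversions underlying these gauges are easy to reproduce. This sketch uses the textbook latent heat of condensation for water near room temperature (my input value, not one taken from the paper) and converts it to eV per molecule and to the wavelength of a photon of equal energy:

```python
# Convert water's latent heat of condensation between gauges:
# kJ/kg -> eV per molecule -> equivalent photon wavelength.
L = 2.45e6           # J/kg, latent heat of water (~room temperature)
M = 0.018015         # kg/mol, molar mass of water
N_A = 6.02214076e23  # Avogadro's number
eV = 1.602176634e-19
h, c = 6.62607015e-34, 2.99792458e8

E_mol = L * M / N_A                  # joules released per molecule
E_eV = E_mol / eV                    # roughly 0.46 eV per molecule
wavelength_nm = h * c / E_mol * 1e9  # photon of the same energy

print(f"{E_eV:.2f} eV/molecule, lambda ~ {wavelength_nm:.0f} nm")
```

The equivalent wavelength comes out in the few-micron infrared band, which is consistent with the paper's picture of condensation as an "invisible infrared combustion".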

**Category:** Thermodynamics and Energy

[215] **viXra:1705.0242 [pdf]**
*submitted on 2017-05-16 03:47:59*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any pair of twin primes [p, q], p ≥ 11, there exists a number n having digit sum 12 such that inserting n after the first digit of p and of q yields two primes. These are almost always twins, as in the case [1481, 1483], where n = 48 is inserted in [11, 13]; the exception is when the twins' first digits differ, as in the case [5669, 6661], where n = 66 is inserted in [59, 61].
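The two examples in the abstract can be verified directly. A minimal sketch (the helper name `insert_after_first_digit` is mine):

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def insert_after_first_digit(p, n):
    """Insert the digits of n after the first digit of p."""
    s = str(p)
    return int(s[0] + str(n) + s[1:])

# [11, 13] with n = 48 (digit sum 12) -> 1481 and 1483: twin primes.
print(insert_after_first_digit(11, 48), insert_after_first_digit(13, 48))

# [59, 61] with n = 66 (digit sum 12) -> 5669 and 6661: both prime,
# though not twins, since the twins' first digits differ.
print(insert_after_first_digit(59, 66), insert_after_first_digit(61, 66))

print(all(is_prime(q) for q in (1481, 1483, 5669, 6661)))  # True
```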

**Category:** Number Theory

[214] **viXra:1705.0241 [pdf]**
*submitted on 2017-05-15 10:36:47*

**Authors:** George Rajna

**Comments:** 33 Pages.

In a recent experiment at EPFL, a microwave resonator, a circuit that supports electric signals oscillating at a resonance frequency, is coupled to the vibrations of a metallic micro-drum. [20]
Researchers at the Institute of Solid State Physics map out a radically new approach for designing optical and electronic properties of materials in Advanced Materials. [19]
Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18]
EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17]
Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16]
When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15]
Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]
Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13]
An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons - thought to be indivisible building blocks of nature - to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation; here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explain also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.
The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[213] **viXra:1705.0240 [pdf]**
*submitted on 2017-05-15 11:18:45*

**Authors:** George R. Briggs

**Comments:** 2 Pages.

Abstract: The early universe's deficit of dark matter is simply explained using up-to-date theory: dark matter entered new recycled active galaxies at their supermassive black holes at an overall steady rate that has continued unchanged to the present day.

**Category:** Relativity and Cosmology

[212] **viXra:1705.0239 [pdf]**
*submitted on 2017-05-15 12:12:07*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page.

The velocity of hydrogen gas at the temperatures hypothesized to have prevailed on Earth and other rocky bodies during their formation is well beyond the escape velocity of Earth and other smaller bodies. So there is a paradox: how exactly did rocks, minerals and water oceans, which contain hydrogen, form, if hydrogen would have escaped Earth's and other bodies' gravitational fields?
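The comparison behind this paradox can be made concrete with the rms thermal speed v = sqrt(3kT/m). A sketch with textbook values (the sample temperatures are my assumptions; note that for Earth, atmospheric escape actually proceeds from the fast tail of the Maxwell distribution, not just the rms speed):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
m_H = 1.6735e-27  # mass of atomic hydrogen, kg

def v_rms(T, m):
    """Root-mean-square thermal speed sqrt(3kT/m), in m/s."""
    return math.sqrt(3.0 * k * T / m)

v_esc_earth = 11186.0  # m/s, Earth escape velocity
v_esc_moon = 2380.0    # m/s, Moon escape velocity

for T in (300.0, 6000.0):  # room temperature vs a hot-formation guess
    v = v_rms(T, m_H)
    print(f"T = {T} K: v_rms = {v:.0f} m/s, "
          f"beats Moon: {v > v_esc_moon}, beats Earth: {v > v_esc_earth}")
```

Even at room temperature, atomic hydrogen's rms speed exceeds the Moon's escape velocity; at the thousands of kelvin invoked for a molten early Earth, it approaches and exceeds Earth's as well, which is the tension the abstract points to.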

**Category:** Astrophysics

[211] **viXra:1705.0238 [pdf]**
*submitted on 2017-05-15 07:23:33*

**Authors:** George Rajna

**Comments:** 25 Pages.

Research from The University of Manchester has thrown new light on the use of miniaturised 'heat engines' that could one day help power nanoscale machines like quantum computers. [15]
The likelihood of seeing quantum systems violating the second law of thermodynamics has been calculated by UCL scientists. [14]
For more than a century and a half of physics, the Second Law of Thermodynamics, which states that entropy always increases, has been as close to inviolable as any law we know. In this universe, chaos reigns supreme. [13]
Physicists have shown that the three main types of engines (four-stroke, two-stroke, and continuous) are thermodynamically equivalent in a certain quantum regime, but not at the classical level. [12]
For the first time, physicists have performed an experiment confirming that thermodynamic processes are irreversible in a quantum system—meaning that, even on the quantum level, you can't put a broken egg back into its shell. The results have implications for understanding thermodynamics in quantum systems and, in turn, designing quantum computers and other quantum information technologies. [11]
Disorder, or entropy, in a microscopic quantum system has been measured by an international group of physicists. The team hopes that the feat will shed light on the "arrow of time": the observation that time always marches towards the future. The experiment involved continually flipping the spin of carbon atoms with an oscillating magnetic field and links the emergence of the arrow of time to quantum fluctuations between one atomic spin state and another. [10]
Mark M. Wilde, Assistant Professor at Louisiana State University, has improved this theorem in a way that allows for understanding how quantum measurements can be approximately reversed under certain circumstances. The new results allow for understanding how quantum information that has been lost during a measurement can be nearly recovered, which has potential implications for a variety of quantum technologies. [9]
Today, we are capable of measuring the position of an object with unprecedented accuracy, but quantum physics and the Heisenberg uncertainty principle place fundamental limits on our ability to measure. Noise that arises as a result of the quantum nature of the fields used to make those measurements imposes what is called the "standard quantum limit." This same limit influences both the ultrasensitive measurements in nanoscale devices and the kilometer-scale gravitational wave detector at LIGO. Because of this troublesome background noise, we can never know an object's exact location, but a recent study provides a solution for rerouting some of that noise away from the measurement. [8]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explain also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Thermodynamics and Energy

[210] **viXra:1705.0237 [pdf]**
*replaced on 2017-05-17 02:21:23*

**Authors:** Gordon Liu

**Comments:** 8 pages

Although mass is a very common and fundamental concept, the problem of mass is still one of the key problems of modern physics, and up to the present experts have not been able to reach a consensus. In this paper, we discuss the problems relating to mass, energy and matter, and find that mass is neither the amount of matter an object has, nor the measure of inertia and the source of the gravitational field; rather, energy is the measure of the inertia of an object and is also the source of the gravitational field. Using mass to measure inertia and to calculate gravitational force is just an approximate method, valid only for very slowly moving bodies and particles whose rest energy is much larger than their kinetic energy. Actually, the concept of mass is a superfluous artificial concept, just the rest energy divided by a constant (if we select c = 1, the mass is exactly equal to the rest energy), and has no other meaning. If the concept of mass is completely superseded by energy (or, more precisely, rest energy), the physical equations are completely consistent, their meanings are clearer, and puzzles such as the relationship of mass and energy, the nature of matter, the essence of the weak equivalence principle, and the physical meaning of the Higgs mechanism can be made clearer.

**Category:** Relativity and Cosmology

[209] **viXra:1705.0236 [pdf]**
*submitted on 2017-05-15 08:53:13*

**Authors:** Gordon Liu

**Comments:** 20 pages, PHYSICS ESSAYS 27, 1 (2014)

The success of Special Relativity (SR) comes from the requirement of Lorentz covariance of all physical equations. The explanation of Lorentz covariance is based on two hypotheses, namely the principle of special relativity and the constancy of the speed of light. However, the statements of the principle of special relativity are various and confusing, and the covariance of physical equations and the equality of inertial frames of reference are mixed up. The equality of inertial frames of reference is obvious, but the covariance of the physical equations is a more advanced requirement. Additionally, the central position given to the propagation property of light in SR has caused misunderstandings about space-time, and there is a logical circularity between the measurement of the speed of light and the synchronization of clocks. These problems have obstructed the correct extension of the theory of space-time from inertial to non-inertial frames of reference, and they are the main reasons why many people criticize SR. In the present paper, the two hypotheses are discussed in detail and a new requirement on the equations of physics is proposed: the Requirement of Special Completeness, namely that the physical equations used to describe the dynamics of matter and/or fields should cover not only the case where the matter and/or fields are at rest relative to an inertial frame of reference, but also the case where they move relative to this frame. Based on this requirement and the equality of inertial frames of reference, we can arrive at SR, thereby giving the theory of Lorentz covariance a clear and solid foundation. The constancy of the speed of light is just a deduction, not a premise, and Lorentz covariance is just a characteristic of the Special Complete equations.
Maxwell's equations automatically satisfy the Lorentz transformations without any modification, while Newton's law of gravity does not, because Newton's law of gravity is not Special Complete and Maxwell's equations are. The new approach paves a road toward generalizing the theory of space-time from inertial to non-inertial frames of reference without considering gravitation.

**Category:** Relativity and Cosmology

[208] **viXra:1705.0235 [pdf]**
*submitted on 2017-05-15 08:55:07*

**Authors:** George Rajna

**Comments:** 41 Pages.

An international team including researchers from MIPT has shown that iodide phasing—a long-established technique in structural biology—is universally applicable to membrane protein structure determination. [23]
Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22]
A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21]
Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20]
Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible: they have developed a new fluorescence microscope, called MINFLUX, that makes it possible for the first time to optically separate molecules that are only nanometers (one millionth of a millimeter) apart from each other. [19]
Dipole orientation provides a new dimension in super-resolution microscopy. [18]
Fluorescence is an incredibly useful tool for experimental biology, and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17]
Molecules that change colour can be used to follow in real time how bacteria form a protective biofilm around themselves. This new method, which has been developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and the food industry, where bacterial biofilms are a problem. [16]
Researchers led by Carnegie Mellon University physicist Markus Deserno and University of Konstanz (Germany) chemist Christine Peter have developed a computer simulation that crushes viral capsids. By allowing researchers to see how the tough shells break apart, the simulation provides a computational window for looking at how viruses and proteins assemble. [15]

**Category:** Physics of Biology

[207] **viXra:1705.0234 [pdf]**
*replaced on 2017-06-09 21:02:32*

**Authors:** Gordon Liu

**Comments:** 9 Pages. update version

In the present paper, we propose an alternative theory of the spacetime of a non-inertial reference frame (NRF), based on the requirement of general completeness (RGC) and the principle of equality of all reference frames (PERF). The RGC requires that the physical equations used to describe the dynamics of matter and/or fields cover not only matter and/or fields at rest, but also matter and/or fields moving relative to the reference frame, with the structure of the spacetime of the reference frame taken into account. The PERF states that any reference frame can be used to describe the motion of matter and/or fields. The spacetime of an NRF is inhomogeneous and deformed, owing to the accelerating motion of the reference frame. The inertial force is the manifestation of the deformed spacetime. The Riemann curvature tensor of the spacetime of an NRF equals zero, but the Christoffel symbols never vanish, no matter what coordinate system is selected in the NRF. Physical equations satisfying the RGC remain covariant under coordinate transformations between reference frames. Mach's principle is incorrect. The problem of the spacetime of an NRF can be solved without considering gravitation.

**Category:** Relativity and Cosmology

[206] **viXra:1705.0233 [pdf]**
*submitted on 2017-05-15 10:03:23*

**Authors:** Gordon Liu

**Comments:** 12 Pages. International Journal of Astronomy and Astrophysics, 2013, 3, 8-19

Let the coordinate system of a flat space-time absorb a second-rank tensor field, deforming the flat space-time into a Riemannian space-time; that is, the tensor field is regarded as a metric tensor with respect to the coordinate system. Once this is done, it is no longer a coordinate system of the flat space-time, but a coordinate system of the new Riemannian space-time. The inverse operation can also be carried out. Following these notions, the concepts of the absorption operation and the desorption operation are proposed. These notions are compatible with Einstein's equivalence principle. Using these concepts, the relationships among the Riemannian space-time, the de Donder conditions and the gravitational field in flat space-time are analyzed and elaborated. The essential significance of the de Donder conditions (the harmonic conditions, or harmonic gauge) is to desorb the tensor field of gravitation from the Riemannian space-time into the Minkowski space-time with Cartesian coordinates. Einstein's equations with the de Donder conditions can then be solved in flat space-time. Based on Fock's work, the equations of the gravitational field in flat space-time are obtained, and a tensor expression for the energy-momentum of the gravitational field is found. They all satisfy global Lorentz covariance.

**Category:** Relativity and Cosmology

[205] **viXra:1705.0232 [pdf]**
*submitted on 2017-05-15 10:06:53*

**Authors:** George Rajna

**Comments:** 33 Pages.

When we look at a painting, how do we know it's a genuine piece of art? [23] Researchers from the University of Illinois at Urbana-Champaign have demonstrated a new level of optical isolation necessary to advance on-chip optical signal processing. The technique involving light-sound interaction can be implemented in nearly any photonic foundry process and can significantly impact optical computing and communication systems. [22] City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21] Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20] Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16] Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. 
[15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]

**Category:** Quantum Physics

[204] **viXra:1705.0231 [pdf]**
*submitted on 2017-05-15 06:37:31*

**Authors:** George Rajna

**Comments:** 21 Pages.

The research team recently succeeded for the first time in precisely controlling the transition temperature of superconducting atomic layers using organic molecules. [31] For the first time, physicists have experimentally validated a 1959 conjecture that places limits on how small superconductors can be. [30] A new finding by physicists at MIT and in Israel shows that under certain specialized conditions, electrons can speed through a narrow opening in a piece of metal more easily than traditional theory says is possible. [29] Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor, meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive, which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. [27] Superconductivity is a rare physical state in which matter is able to conduct electricity—maintain a flow of electrons—without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov homes in on the structural changes underlying superconductivity in iron arsenide compounds—those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. 
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators as well, we can explain the electron/proton mass ratio and the Weak and Strong Interactions.

**Category:** Quantum Physics

[203] **viXra:1705.0230 [pdf]**
*submitted on 2017-05-14 14:57:38*

**Authors:** Al-Shobaki, Mazen J.; Abu-Naser, Samy S.; Kassab, Mohammed Khair I.

**Comments:** 15 pages

The research aims to identify the status of the application of the electronic document management system in governmental institutions; the study was applied to the Palestinian Pension Agency. The population of this study comprises all employees of the Palestinian Pension Agency. To achieve the objectives of the study, the researchers used a descriptive and analytical approach, through which they describe the phenomenon under study, analyze the data, and examine the relationships among its components and the views put forward around it. The census method was used owing to the small size of the study population and the ease of access to the target group. (108) questionnaires were distributed to all members of the study population, of whom (65) were employees in the Gaza Strip and (43) employees in the West Bank. All questionnaires were recovered.
The study found the following results: There were no statistically significant differences in the responses of the population members about the reality of the application of the electronic document management system in governmental institutions (case study on the Palestinian Pension Authority) due to the age variable, the nature-of-the-job variable, or the specialization variable. There are statistically significant differences due to the qualification variable, in favor of members of the study population holding a Bachelor's degree, and due to the years-of-experience variable, in favor of members of the study population with between 11 and 15 years of experience.

**Category:** Social Science

[202] **viXra:1705.0229 [pdf]**
*submitted on 2017-05-14 15:03:41*

**Authors:** Al-Shobaki, Mazen J.; Abu-Naser, Samy S.; Ammar, Tarek M.

**Comments:** 18 pages

The aim of the study is to identify the degree of administrative transparency in the Palestinian higher education institutions in the Gaza Strip. The researchers adopted a descriptive and analytical method. The research population consisted of administrative staff, whether academic or administrative, except for those in senior management or on the university council. The study population comprised 392 employees, from which a random sample of (197) was selected. The number of questionnaires recovered was (160), a recovery rate of (81.2%). The researchers used a questionnaire for data collection, and the data were processed with SPSS to obtain the results.
The results show no significant difference between male and female responses due to the gender variable, and no significant difference between respondents' responses due to the age variable. The results did show significant differences between respondents' responses attributed to the university variable, the academic qualification variable, the management level variable, and the years-of-service variable.
The research reached a number of recommendations, the most important of which are: Palestinian universities should adhere to the application of transparency standards in all university activities; they should benefit from regional and international experience in the application of transparency systems within universities and examine the possibility of applying these systems in our universities; they should engage in the program of teaching transparency in universities, as only five universities have so far participated in this experiment; and awareness should be raised among the employees of Palestinian universities, through workshops and seminars, to clarify the foundations of building transparency and its dimensions.

**Category:** Social Science

[201] **viXra:1705.0228 [pdf]**
*submitted on 2017-05-14 15:07:36*

**Authors:** Al-Shobaki, Mazen J.; Abu-Naser, Samy S.

**Comments:** 14 pages

This study aimed to identify the degree of use of the capabilities of decision-support systems in Palestinian higher education institutions, with Al-Aqsa University in Gaza as a case study. The study used an analytical descriptive approach, and the researchers used a questionnaire tool to collect the data. Using a stratified random sample, the researchers distributed (150) questionnaires to the study population; (126) were returned, a response rate of 84%.
The study's most important results are: senior management supports the existence of decision-support systems, and the respondents approve of the items on the use of the capabilities of decision-support systems in general. There are no significant differences between the average answers of respondents about the degree of use of decision-support system capabilities attributable to personal data.
The study also concluded with a series of recommendations, including: increasing senior management's adoption of decision-support systems in their decision-making; increasing organizational attention to the capabilities that decision-support systems make directly available to senior management in the Palestinian universities in the Gaza Strip; increasing interest in the physical, technical and human capabilities available for the use of decision-support systems; investing the information available to universities in building capacity to integrate these techniques with other information technology capabilities; and empowering human resources in the universities to participate in making decisions concerning the construction of information technology capabilities.

**Category:** Social Science

[200] **viXra:1705.0227 [pdf]**
*submitted on 2017-05-14 15:08:42*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 3 Pages.

Some notes are referenced to put together a much more cohesive picture of stellar evolution using a few ideas presented by Pierre-Marie Robitaille in his Liquid Metallic Hydrogen Solar Model as compared to Stellar Metamorphosis.

**Category:** Astrophysics

[199] **viXra:1705.0226 [pdf]**
*submitted on 2017-05-14 17:15:50*

**Authors:** Farzad Didehvar

**Comments:** 9 Pages.

Throughout this paper, we try to show how and why our mathematical framework seems inappropriate for solving problems in the Theory of Computation. More precisely, the concept of turning back in time in paradoxes causes inconsistency in the modeling of the concept of time in some semantic situations. In the first chapter, by introducing a version of the "Unexpected Hanging Paradox", we attempt to open a new explanation for some paradoxes. In the second step, by applying this paradox, it is demonstrated that any formalized system for the Theory of Computation based on Classical Logic and the Turing Model of Computation leads us to a contradiction. We conclude that our mathematical framework is inappropriate for the Theory of Computation. Furthermore, the result provides a reason why many problems in Complexity Theory resist solution.

**Category:** Set Theory and Logic

[198] **viXra:1705.0225 [pdf]**
*replaced on 2018-01-07 22:35:35*

**Authors:** Jody A. Geiger

**Comments:** 28 Pages.

Providing an explanation of dark energy, inflation and the big bang has proved difficult and elusive. Using the principles of Informativity—a model based on counts of fundamental measures of length, mass and time—an understanding of the expansion of our universe is resolved. Several expressions—mass, density, age of the universe, Hubble’s constant, observable matter, dark energy, inflation, cosmic microwave background and the processes that led to the big bang—are presented that exemplify the approach and mathematical procedures. The postulates of Informativity change our understanding of space and provide a framework with which to understand each of these phenomena.

**Category:** Relativity and Cosmology

[197] **viXra:1705.0224 [pdf]**
*submitted on 2017-05-15 02:29:14*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any prime p, p ≥ 7, there exists a prime q obtained by inserting a number n with the sum of digits equal to 12 before the last digit of p.
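The conjecture lends itself to a quick numerical spot-check. The sketch below is illustrative only: the `witness` helper, the trial-division primality test and the search limit are choices made for this example, not part of the paper.

```python
def is_prime(m):
    """Trial-division primality test, sufficient for this small search."""
    if m < 2:
        return False
    if m % 2 == 0:
        return m == 2
    d = 3
    while d * d <= m:
        if m % d == 0:
            return False
        d += 2
    return True

def digit_sum(n):
    return sum(int(c) for c in str(n))

def witness(p, limit=10**5):
    """Smallest n with digit sum 12 whose insertion before the last digit
    of p gives a prime q; returns (n, q), or None if none below the limit."""
    s = str(p)
    for n in range(1, limit):
        if digit_sum(n) == 12:
            q = int(s[:-1] + str(n) + s[-1])
            if is_prime(q):
                return n, q
    return None

# spot-check the conjecture for every prime 7 <= p < 100
primes = [p for p in range(7, 100) if is_prime(p)]
results = {p: witness(p) for p in primes}
```

For example, for p = 7 the smallest admissible n is 39 (digit sum 12), giving q = 397, which is prime.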

**Category:** Number Theory

[196] **viXra:1705.0223 [pdf]**
*submitted on 2017-05-15 03:07:04*

**Authors:** Arturo Tozzi, James F Peters

**Comments:** 13 Pages.

A novel daemon-based architecture is introduced to elucidate some brain functions, such as pattern recognition during human perception and the mental interpretation of visual scenes. Taking into account the concepts of invariance and persistence in topology, we introduce a Selfridge pandemonium variant of brain activity with a novel feature, namely extended feature daemons that, in addition to the usual recognition of short straight and curved lines, recognize topological features of visual scene shapes, such as shape interior, density and texture. A series of transformations can be gradually applied to a pattern, in particular to the shape of an object, without affecting its invariant properties, such as the boundedness and connectedness of the parts of a visual scene. We also introduce another pandemonium implementation: low-level representations of objects can be mapped to higher-level views (our mental interpretations), making it possible to construct a symbolic multidimensional representation of the environment. The representations can be projected continuously onto an object that we have seen and continue to see, thanks to the mapping from shapes in our memory to shapes in Euclidean space. A multidimensional vista detectable by the brain (brainscapes) results from the presence of daemons (mind channels) that detect not only ordinary views of the shapes in visual scenes, but also the features of the shapes. Although perceived shapes are 3-dimensional (3+1-dimensional, if we include time), shape features (volume, colour, contour, closeness, texture, and so on) lead to n-dimensional brainscapes. We arrive at 5 as a minimum shape feature space, since every visual shape has at least a contour in space-time. We discuss the advantages of our parallel, hierarchical model in pattern recognition, computer vision and the evolution of biological nervous systems.
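For readers unfamiliar with Selfridge's pandemonium, a minimal sketch of the daemon hierarchy may help. The feature set, glyph encodings and scoring rule below are invented for illustration and are not the extended topological daemons of the paper.

```python
# Minimal Selfridge-style pandemonium (illustrative assumptions: the
# feature names and glyph encodings are invented for this example).
GLYPHS = {
    "A": {"oblique", "horizontal", "closed_top"},
    "H": {"left_vertical", "right_vertical", "horizontal"},
    "T": {"left_vertical", "horizontal_top"},
}

def feature_daemons(image_features):
    """Each feature daemon 'shouts' 1 if its feature is present in the image."""
    return {f: 1 for f in image_features}

def cognitive_daemons(shouts):
    """Each cognitive daemon scores how well the shouted features match its glyph."""
    return {g: sum(shouts.get(f, 0) for f in feats) / len(feats)
            for g, feats in GLYPHS.items()}

def decision_daemon(scores):
    """The decision daemon picks the loudest cognitive daemon."""
    return max(scores, key=scores.get)

shouts = feature_daemons({"left_vertical", "right_vertical", "horizontal"})
letter = decision_daemon(cognitive_daemons(shouts))
```

The paper's proposal amounts to enlarging the vocabulary of the feature daemons with topological descriptors (interior, density, texture) rather than only line segments.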

**Category:** Artificial Intelligence

[195] **viXra:1705.0222 [pdf]**
*submitted on 2017-05-15 03:34:42*

**Authors:** George Rajna

**Comments:** 31 Pages.

Researchers at the Institute of Solid State Physics map out a radically new approach for designing optical and electronic properties of materials in Advanced Materials. [19] Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18] EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17] Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16] When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15] Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14] Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13] An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. 
This state, known as a quantum spin liquid, causes electrons, thought to be indivisible building blocks of nature, to break into pieces. [12] In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were a "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11] Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. 
The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron/proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[194] **viXra:1705.0221 [pdf]**
*submitted on 2017-05-15 03:41:04*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any prime p, p ≥ 5, there exists a prime q obtained by inserting a number n with the sum of digits equal to 12 after the first digit of p.
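Like the companion conjecture on insertion before the last digit, this one can be spot-checked numerically. The generator, the trial-division primality test and the search limit below are illustrative choices, not part of the paper.

```python
def is_prime(m):
    """Trial-division primality test, sufficient for this small search."""
    if m < 2:
        return False
    if m % 2 == 0:
        return m == 2
    d = 3
    while d * d <= m:
        if m % d == 0:
            return False
        d += 2
    return True

def insertions(p):
    """Yield (n, q) where q is obtained by inserting n (with digit sum 12)
    after the first digit of p, for n = 1, 2, 3, ..."""
    s = str(p)
    n = 1
    while True:
        if sum(int(c) for c in str(n)) == 12:
            yield n, int(s[0] + str(n) + s[1:])
        n += 1

def first_prime_witness(p, limit=10**5):
    """Smallest n below the limit whose insertion yields a prime q."""
    for n, q in insertions(p):
        if n >= limit:
            return None
        if is_prime(q):
            return n, q

# spot-check the conjecture for every prime 5 <= p < 100
checked = {p: first_prime_witness(p) for p in range(5, 100) if is_prime(p)}
```

For p = 5, the candidates 539 = 7 x 7 x 11 and 548 fail, and the first success is n = 57, giving the prime q = 557.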

**Category:** Number Theory

[193] **viXra:1705.0220 [pdf]**
*submitted on 2017-05-14 08:02:21*

**Authors:** George Rajna

**Comments:** 20 Pages.

For the first time, scientists have subjected quantum entanglement to extreme levels of acceleration, and there's nothing fragile about this "spooky action at a distance"; it's far more robust than we thought. [13] Now, new research in the American Physical Society's journal Physical Review Letters brings aspects of the two together in an experiment that shows, for the first time, that gravity stretches and squeezes quantum objects through tidal forces. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now, in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has caused a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence suggesting that the speed of light, as defined by the theory of general relativity, is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. 
Even if gravitons are there, it's probable that we would never be able to perceive them. Perhaps, assuming they persist inside a robust model of quantum gravity, there may be secondary ways of proving their existence. [7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this is true on the quantum level as well, it gives the basis of Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Quantum Physics

[192] **viXra:1705.0219 [pdf]**
*replaced on 2017-06-21 15:44:27*

**Authors:** D. Chakalov

**Comments:** 7 Pages. Three references added. Comments welcome.

A brief outline of Penrose diagrams and related topics.

**Category:** Relativity and Cosmology

[191] **viXra:1705.0218 [pdf]**
*submitted on 2017-05-14 09:42:59*

**Authors:** Madonna-Megara Holloway

**Comments:** 10 Pages.

The Unification of Quantum Mechanics, General Relativity and Consciousness – An Excerpt from The Secret Doctrine Volume IV, The Nature of Everything

**Category:** Quantum Physics

[190] **viXra:1705.0217 [pdf]**
*submitted on 2017-05-14 04:25:18*

**Authors:** Tal Ben Yakar

**Comments:** 6 Pages.

Finding the optimal driving route has attracted considerable attention in recent years. The problem sounds simple, yet the companies working on it today find it very challenging: taxi-alternative companies such as Uber and Via, ridesharing and maps companies such as HERE, navigation companies such as Waze, and public transportation companies such as Moovit, among others. AI robots, in addition, need the ability to route in an optimal manner. In this work we formulate the problem of finding optimal routes as an optimization problem and arrive at a neat, low-memory and fast solution using machine learning algorithms.
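As a point of reference for the optimization formulation, the classical shortest-path baseline that learned routers are usually compared against can be sketched with Dijkstra's algorithm. The toy road graph and its travel times below are invented for illustration and are not from the paper.

```python
import heapq

def dijkstra(graph, source, target):
    """Classical shortest-path search; returns (cost, path), or (inf, []) if unreachable."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            # reconstruct the path by walking the predecessor links back to the source
            path = [u]
            while u != source:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float("inf"), []

# illustrative road graph: node -> [(neighbour, travel time in minutes)]
roads = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}
cost, path = dijkstra(roads, "A", "D")
```

A learned router would replace or prune this exhaustive search, but its output can be validated against such a baseline on small graphs.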

**Category:** Artificial Intelligence

[189] **viXra:1705.0216 [pdf]**
*submitted on 2017-05-13 16:51:07*

**Authors:** Richard J. Mathar

**Comments:** 4 pages with 7 figures.

The manuscript provides a novel starting guess for the solution of Kepler's equation for the unknown eccentric anomaly E, given the eccentricity e and the mean anomaly M of an elliptical orbit.
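Kepler's equation M = E - e sin E is usually solved by Newton iteration from some starting guess. The sketch below uses the conventional guess E0 = M + e sin M, not the novel guess proposed in the manuscript:

```python
import math

def solve_kepler(M, e, tol=1e-12, max_iter=50):
    """Newton iteration for E - e*sin(E) = M (elliptical orbit, 0 <= e < 1).
    Starting guess is the conventional E0 = M + e*sin(M), not the
    manuscript's novel guess."""
    E = M + e * math.sin(M)
    for _ in range(max_iter):
        f = E - e * math.sin(E) - M          # residual of Kepler's equation
        E -= f / (1.0 - e * math.cos(E))     # Newton step
        if abs(f) < tol:
            break
    return E

E = solve_kepler(M=1.0, e=0.3)
```

A better starting guess, such as the one the manuscript proposes, reduces the number of Newton steps needed, especially at high eccentricity.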

**Category:** Astrophysics

[188] **viXra:1705.0214 [pdf]**
*submitted on 2017-05-13 20:48:18*

**Authors:** Haytham Chibani

**Comments:** 5 Pages.

Single-atom cavity quantum electrodynamics grants access to nonclassical photon statistics, while electromagnetically induced transparency exhibits a dark state of long coherence time. The combination of the two produces a new light field via four-wave mixing that shows long-lived quantum statistics. We observe the new field in the emission from the cavity as a beat with the probe light that, together with the control beam and the cavity vacuum, drives the four-wave mixing process. Moreover, the control field allows us to tune the new light field from antibunching to bunching, demonstrating our all-optical control over the photon-pair emission.

**Category:** Quantum Physics

[187] **viXra:1705.0213 [pdf]**
*submitted on 2017-05-13 20:50:38*

**Authors:** Andrew Beckwith

**Comments:** 8 Pages.

Analyzing a potential violation of a Penrose singularity theorem via root finders and other means.

**Category:** Quantum Gravity and String Theory

[186] **viXra:1705.0212 [pdf]**
*submitted on 2017-05-14 02:26:57*

**Authors:** Sjaak Uitterdijk

**Comments:** 3 Pages.

Showing that the Special Theory of Relativity is an untenable theory often leads to the reaction that the GPS is so accurate thanks to the STR corrections. This article shows that the supposed relativity errors are negligible compared with the errors caused by atmospheric conditions.
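For context, the textbook estimate of the special- and general-relativistic clock offsets for a GPS satellite (the figures the article weighs against atmospheric errors) can be reproduced in a few lines. This is the conventional calculation, not the article's own error analysis.

```python
# Conventional estimate of relativistic clock offsets for a GPS satellite.
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c  = 2.99792458e8     # speed of light, m/s
R  = 6.371e6          # mean Earth radius, m
r  = 2.6561e7         # GPS orbital radius, m
v2 = GM / r           # orbital speed squared for a circular orbit

day = 86400.0
special = -0.5 * v2 / c**2 * day           # time dilation: satellite clock runs slow
general = GM * (1/R - 1/r) / c**2 * day    # gravitational blueshift: clock runs fast
net = special + general                    # net offset, roughly +38 microseconds/day
```

The article's claim concerns how these offsets compare with atmospheric (ionospheric and tropospheric) propagation errors, which the standard GPS error budget treats separately.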

**Category:** Mathematical Physics

[185] **viXra:1705.0211 [pdf]**
*submitted on 2017-05-13 12:24:55*

**Authors:** Rodolfo A. Frino

**Comments:** 4 Pages.

This work describes a method of generalizing incomplete physical laws through the scale law. Generalization can only be applied when the general law exists but has not yet been discovered. It is remarkable that the very simple methodology described in this paper turns out to be so powerful.

**Category:** Quantum Physics

[184] **viXra:1705.0210 [pdf]**
*replaced on 2017-05-20 07:34:40*

**Authors:** Saenko V.I.

**Comments:** 3 Pages. This is the Russian version, the English one is directed to the peer-reviewed journal

It is proved that the irreducible map, according to Franklin, consists of 5 regions and, as a consequence, that 4 colors are sufficient for colouring any map on the sphere.

**Category:** Topology

[183] **viXra:1705.0209 [pdf]**
*submitted on 2017-05-13 07:08:17*

**Authors:** George Rajna

**Comments:** 18 Pages.

Now, new research in the American Physical Society's journal Physical Review Letters brings aspects of the two together in an experiment that shows, for the first time, that gravity stretches and squeezes quantum objects through tidal forces. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe, hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has caused a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence suggesting that the speed of light, as defined by the theory of general relativity, is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. Even if gravitons exist, it is probable that we would never be able to perceive them directly. Perhaps, assuming they survive within a robust model of quantum gravity, there may be indirect ways of proving their existence.
[7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level as well, it provides the basis of Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Quantum Gravity and String Theory

[182] **viXra:1705.0208 [pdf]**
*submitted on 2017-05-13 04:18:00*

**Authors:** George Rajna

**Comments:** 17 Pages.

Researchers from the University of Central Florida and Boston University have developed a novel approach to solving such difficult computational problems more quickly. [29] By precisely measuring the entropy of a cerium copper gold alloy with baffling electronic properties cooled to nearly absolute zero, physicists in Germany and the United States have gleaned new evidence about the possible causes of high-temperature superconductivity and similar phenomena. [28] Physicists have theoretically shown that a superconducting current of electrons can be induced to flow by a new kind of transport mechanism: the potential flow of information. [27] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the observed changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account also the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Since superconductivity is basically a quantum mechanical phenomenon, and entanglement gives this opportunity to specific materials, such as Cooper pairs, strongly correlated materials and exciton-mediated electron pairing, we can say that the secret of superconductivity is quantum entanglement.

**Category:** Digital Signal Processing

[181] **viXra:1705.0207 [pdf]**
*submitted on 2017-05-12 13:28:24*

**Authors:** George Rajna

**Comments:** 21 Pages.

International team solves mystery of colloidal chains. [13] An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons, thought to be indivisible building blocks of nature, to break into pieces. [12] In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were a "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11] Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent", that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less well researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[180] **viXra:1705.0206 [pdf]**
*submitted on 2017-05-12 15:03:41*

**Authors:** George Rajna

**Comments:** 22 Pages.

For the first time, scientists have succeeded in studying the strength of hydrogen bonds in a single molecule using an atomic force microscope. [14] International team solves mystery of colloidal chains. [13] An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons, thought to be indivisible building blocks of nature, to break into pieces. [12] In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were a "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11] Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent", that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less well researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns.
The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[179] **viXra:1705.0205 [pdf]**
*submitted on 2017-05-12 16:11:03*

**Authors:** Evgeny A Novikov

**Comments:** 6 Pages.

New simple and exact analytical solutions of the Einstein equations of general relativity (GR) and of the Qmoger (quantum modification of GR) equations are obtained. These solutions correspond to processes with invariant density of enthalpy (energy plus pressure). An interpretation of these solutions in terms of cosmic radiation and production of massive particles, as well as a comparison with cosmic data (without fitting), is presented. It is suggested that isenthalpic processes may also be relevant to the excessive radiation from Jupiter and Saturn. Similar processes could potentially be used as a new source of energy on Earth.

**Category:** Quantum Gravity and String Theory

[178] **viXra:1705.0204 [pdf]**
*submitted on 2017-05-12 21:13:03*

**Authors:** Yanming Wei

**Comments:** 5 pages, 0 figure. DOI: 10.13140/RG.2.2.28394.93125

Energy breakeven is the key to utilizing fusion energy. This paper predicts that Z-pinch-based fusion breakeven is possible in the near future, provided a pulsed DC power supply with higher voltage and far greater current than the prior LTD (Linear Transformer Driver) becomes available, whereas accelerator-based fusion remains hopeless.

**Category:** Nuclear and Atomic Physics

[177] **viXra:1705.0203 [pdf]**
*submitted on 2017-05-12 22:53:16*

**Authors:** ShengYu.Shu

**Comments:** 62 Pages.

I have mainly analyzed the mathematical meaning of non-classical mathematical theory for three fundamental physics equations, Maxwell's equations, Dirac's equations and Einstein's equations, from the quantized core theory of ancient China's Taoism, and found that they have some structures described in the core of that theory; in particular, they all clearly exhibit the yin-yang induction structure. This reveals, in a way, the relations between ancient China's Taoism and modern mathematics and physics, which may help us to understand some problems of the fundamental theory of physics.

**Category:** Mathematical Physics

[176] **viXra:1705.0202 [pdf]**
*replaced on 2017-06-07 08:48:04*

**Authors:** Sylwester Kornowski

**Comments:** 5 Pages.

Here, within the Scale-Symmetric Theory (SST), we show that the Z and W bosons can be created via three different mechanisms. One mechanism is associated with a transition from electromagnetic interactions to weak interactions of protons with electrons in the presence of dark matter (DM), while the second concerns a transition from weak interactions of protons to weak interactions of the charges of protons, which mimic the behaviour of electrons in the absence of DM, with muons associated with protons. In the first mechanism, the calculated mass of the Z is 91.181 GeV and of the W is 80.427 GeV, while in the second mechanism we obtained 91.205 GeV and 80.385 GeV respectively. The third mechanism leads to W-boson masses of 80.473 GeV and 80.380 GeV (mean value 80.427 GeV). We show that the recent cosmic-ray antiproton data from AMS-02 also concern transitions between different interactions, so the results do not follow from dark-matter annihilation. We emphasize that in an earlier paper we calculated lifetimes of the Z and W bosons that are very close to experimental data.

**Category:** High Energy Particle Physics

[175] **viXra:1705.0201 [pdf]**
*submitted on 2017-05-13 01:05:28*

**Authors:** Wei Fan

**Comments:** 6 Pages.

In the history of physics, great achievements have been made in the field of electromagnetism, making us aware of the existence of charge, current, electric and magnetic fields. However, what is the essence of charge, current, electric and magnetic fields? This remains a mystery. If we can unravel these puzzles, it will not only satisfy our curiosity but also contribute to the development of electromagnetism. Fortunately, we have done so and obtained some interesting findings: an electric charge is the impulse of electronic angular momentum; the quantity of charge is the quantity of electronic angular momentum impulses; a conduction current is a conduction force flow (current is a type of force flow); an electric field is a magnetic field in nature (because triboelectrification produces a magnetic field); a magnetic field is the superposition state of multiple medium impulse moments; and the electromagnetic force is the medium impulse moments produced by the collision of electronic angular momentum pulses with medium photons.

**Category:** Condensed Matter

[174] **viXra:1705.0200 [pdf]**
*submitted on 2017-05-13 01:15:48*

**Authors:** Wei Fan

**Comments:** 11 Pages.

Newtonian mechanics is a subfield of mechanics that deals with the motion of particles, and is by far the clearest and most concise kinematics. Although Newtonian mechanics can describe the motion of a particle well, it is not applicable to the motion of bodies other than particles. That is why it cannot explain the rotation and revolution of celestial bodies, or the impetus for their movements. Based on Newtonian mechanics, this paper presents an extended discussion of the motion of non-particle objects and puts forth a new kinetic theory that can explain the rotation and revolution of celestial bodies and the motivating force for their movements. It also provides a feasible way of explaining the origin of gravity.

**Category:** Astrophysics

[173] **viXra:1705.0199 [pdf]**
*submitted on 2017-05-12 11:51:49*

**Authors:** Rodolfo A. Frino

**Comments:** 14 Pages.

This paper explores the scale factors of three laws: (a) Einstein's relativistic energy law, (b) Newton's law of universal gravitation and (c) the special universal uncertainty principle. Two new concepts are defined: complete energy laws and incomplete energy laws. This investigation shows that the first two laws have scale factors of 1 while the third has a scale factor of -1. These results could be useful in the future for predicting scale factors of new laws of nature.

**Category:** Quantum Physics

[172] **viXra:1705.0198 [pdf]**
*submitted on 2017-05-12 07:28:04*

**Authors:** George Rajna

**Comments:** 14 Pages.

The future of nano-electronics is here. A team of researchers from the Air Force Research Laboratory, the Colorado School of Mines, and Argonne National Laboratory in Illinois has developed a novel method for the synthesis of a composite material that has the potential to vastly improve the electronics used by the Air Force. [28] Physicists have theoretically shown that a superconducting current of electrons can be induced to flow by a new kind of transport mechanism: the potential flow of information. [27] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the observed changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account also the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Since superconductivity is basically a quantum mechanical phenomenon, and entanglement gives this opportunity to specific materials, such as Cooper pairs, strongly correlated materials and exciton-mediated electron pairing, we can say that the secret of superconductivity is quantum entanglement.

**Category:** Condensed Matter

[171] **viXra:1705.0197 [pdf]**
*submitted on 2017-05-12 08:21:12*

**Authors:** George Rajna

**Comments:** 15 Pages.

By precisely measuring the entropy of a cerium copper gold alloy with baffling electronic properties cooled to nearly absolute zero, physicists in Germany and the United States have gleaned new evidence about the possible causes of high-temperature superconductivity and similar phenomena. [28]
Physicists have theoretically shown that a superconducting current of electrons can be induced to flow by a new kind of transport mechanism: the potential flow of information. [27]
This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the observed changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories.
The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account also the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions.
Since superconductivity is basically a quantum mechanical phenomenon, and entanglement gives this opportunity to specific materials, such as Cooper pairs, strongly correlated materials and exciton-mediated electron pairing, we can say that the secret of superconductivity is quantum entanglement.

**Category:** Quantum Physics

[170] **viXra:1705.0196 [pdf]**
*submitted on 2017-05-12 09:17:02*

**Authors:** George Rajna

**Comments:** 26 Pages.

Researchers at ETH have now developed a method by which such frequency combs can be created much more simply and cheaply than before. [17]
A novel way to harness lasers and plasmas may give researchers new ways to explore outer space and to examine bugs, tumors and bones back on planet Earth. [16]
A team of researchers at Harvard University has successfully cooled a three-atom molecule down to near absolute zero for the first time. [15]
A research team led by UCLA electrical engineers has developed a new technique to control the polarization state of a laser that could lead to a new class of powerful, high-quality lasers for use in medical imaging, chemical sensing and detection, or fundamental science research. [14]
UCLA physicists have shown that shining multicolored laser light on rubidium atoms causes them to lose energy and cool to nearly absolute zero. This result suggests that atoms fundamental to chemistry, such as hydrogen and carbon, could also be cooled using similar lasers, an outcome that would allow researchers to study the details of chemical reactions involved in medicine. [13]
Powerful laser beams, given the right conditions, will act as their own lenses and "self-focus" into a tighter, even more intense beam. University of Maryland physicists have discovered that these self-focused laser pulses also generate violent swirls of optical energy that strongly resemble smoke rings. [12]
Electrons fingerprint the fastest laser pulses. [11]
A team of researchers with members from Germany, the U.S. and Russia has found a way to measure the time it takes for an electron in an atom to respond to a pulse of light. [10]
As an elementary particle, the electron cannot be broken down into smaller particles, at least as far as is currently known. However, in a phenomenon called electron fractionalization, in certain materials an electron can be broken down into smaller "charge pulses," each of which carries a fraction of the electron's charge. Although electron fractionalization has many interesting implications, its origins are not well understood. [9]
New ideas for interactions and particles: this paper examines the possibility of deriving the Spontaneously Broken Symmetries from the Planck Distribution Law. This way we get a Unification of the Strong, Electromagnetic, and Weak Interactions from the interference occurrences of oscillators. Understanding that the relativistic mass change is the result of the magnetic induction, we arrive at the conclusion that the Gravitational Force is also based on the electromagnetic forces, giving a Unified Relativistic Quantum Theory of all 4 Interactions.

**Category:** Quantum Physics

[169] **viXra:1705.0195 [pdf]**
*submitted on 2017-05-12 09:45:21*

**Authors:** God Bo

**Comments:** 3 Pages.

A new theory of quantum MASM, approved by hundreds of professors.

**Category:** Quantum Physics

[168] **viXra:1705.0194 [pdf]**
*submitted on 2017-05-12 04:58:46*

**Authors:** George Rajna

**Comments:** 31 Pages.

Engineering researchers at the University of Minnesota have developed a revolutionary process for 3D printing stretchable electronic sensory devices that could give robots the ability to feel their environment. The discovery is also a major step forward in printing electronics on real human skin. [18] Researchers from France and the University of Arkansas have created an artificial synapse capable of autonomous learning, a component of artificial intelligence. [17] Intelligent machines of the future will help restore memory, mind your children, fetch your coffee and even care for aging parents. [16] Unlike experimental neuroscientists who deal with real-life neurons, computational neuroscientists use model simulations to investigate how the brain functions. [15] A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14] A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13] Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12] A team of researchers working at the University of California (and one from Stony Brook University) has for the first time created a neural-network chip that was built using just memristors. In their paper published in the journal Nature, the team describes how they built their chip and what capabilities it has.
[11] A team of researchers used a promising new material to build more functional memristors, bringing us closer to brain-like computing. Both academic and industrial laboratories are working to develop computers that operate more like the human brain. Instead of operating like a conventional, digital system,

**Category:** Digital Signal Processing

[167] **viXra:1705.0193 [pdf]**
*replaced on 2017-06-24 03:02:24*

**Authors:** Wan-Chung Hu

**Comments:** 19 Pages.

The most widely accepted atom model was proposed by Dr. Bohr and subsequently by Dr. Schrodinger and Dr. Dirac [1,2]. However, many phenomena cannot be explained by Bohr's atom model. He used the Coulomb electric force as the centripetal force to explain the rotation of electrons around the nucleus. Other very important basic forces, the magnetic force and the frame-dragging force (spinity), were neglected and not included in his atom model. In Schrodinger's atom model, there are problems limiting the formation of a correct atom model, such as the uncertainty principle, Schrodinger's cat, and the EPR paradox [3-5]. In this study, a new deterministic atom model is proposed to explain atomic phenomena and to solve the above puzzles.

**Category:** Nuclear and Atomic Physics

[166] **viXra:1705.0192 [pdf]**
*submitted on 2017-05-12 05:58:15*

**Authors:** George Rajna

**Comments:** 19 Pages.

At very high energies, the collision of massive atomic nuclei in an accelerator generates hundreds or even thousands of particles that undergo numerous interactions. [11] The first experimental result has been published from the newly upgraded Continuous Electron Beam Accelerator Facility (CEBAF) at the U.S. Department of Energy's Thomas Jefferson National Accelerator Facility. The result demonstrates the feasibility of detecting a potential new form of matter to study why quarks are never found in isolation. [10] A team of scientists currently working at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) announced that it has possibly discovered the existence of a particle integral to nature, in statements on Tuesday, Dec. 15, and again on Dec. 16. [9] In 2012, a proposed observation of the Higgs boson was reported at the Large Hadron Collider at CERN. The observation has puzzled the physics community, as the mass of the observed particle, 125 GeV, looks lighter than the expected energy scale of about 1 TeV. [8] 'In the new run, because of the highest-ever energies available at the LHC, we might finally create dark matter in the laboratory,' says Daniela. 'If dark matter is the lightest SUSY particle then we might discover many other SUSY particles, since SUSY predicts that every Standard Model particle has a SUSY counterpart.' [7] The problem is that there are several things the Standard Model is unable to explain, for example the dark matter that makes up a large part of the universe. Many particle physicists are therefore working on the development of new, more comprehensive models. [6] They might seem quite different, but both the Higgs boson and dark matter particles may have some similarities. The Higgs boson is thought to be the particle that gives matter its mass. And in the same vein, dark matter is thought to account for much of the 'missing mass' in galaxies in the universe.
It may be that these mass-giving particles have more in common than was thought. [5] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate by the diffraction patterns. The accelerating charges explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Relativistic Quantum Theories. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level as well, it provides the basis of Quantum Gravity.

**Category:** High Energy Particle Physics

[165] **viXra:1705.0191 [pdf]**
*submitted on 2017-05-11 15:35:38*

**Authors:** Ivan L. Zhogin

**Comments:** 2 pages (English and Russian)

The special and general relativity theories give different descriptions of our spacetime. It is impossible that both are absolutely true; nevertheless, both theories are employed in different parts of physics. Such a fragmentary worldview was unacceptable to Einstein, and he proposed another theory which in a sense (at least in the symmetry group of its equations) was a synthesis of SR and GR; a few features of this theory are touched on here.

**Category:** Relativity and Cosmology

[164] **viXra:1705.0190 [pdf]**
*submitted on 2017-05-11 16:44:25*

**Authors:** Evgeny A. Novikov

**Comments:** 6 Pages.

New simple and exact analytical solutions of the Einstein equations of general relativity (GR) and of the Qmoger (quantum modification of GR) equations are obtained. These solutions correspond to processes with invariant density of enthalpy (energy plus pressure). An interpretation of these solutions in terms of cosmic radiation and production of massive particles, as well as a comparison with cosmic data (without fitting), is presented. It is suggested that isenthalpic processes may also be relevant to the excessive radiation from Jupiter and Saturn. Similar processes could potentially be used as a new source of energy on Earth.

**Category:** Relativity and Cosmology

[163] **viXra:1705.0189 [pdf]**
*submitted on 2017-05-11 21:28:54*

**Authors:** Russell Leidich

**Comments:** 5 Pages.

Logplex codes are universal codes, that is, bitstrings which map one-to-one to the whole numbers regardless of the bits which follow them in memory. The codes are dense, in the sense that there is no finite series of bits which does not map to at least one whole number. Their asymptotic efficiency (size out divided by size in) is one, as with Elias omega codes [1], but they have some convenient features absent in the latter:
Given whole numbers M and N, if M < N then logplex(M) < logplex(N). This provides for more efficient searching and sorting, as such tasks can be done without the need to allocate separate memory for the corresponding decoded whole numbers.
For all nonzero M, M itself is encoded verbatim in the high bits of its logplex. In all cases, the high (last) bit of a logplex is one.
The representation of all subparts of a logplex is bitwise little endian. This is in contrast to Elias omega codes, in which the endianness of the subparts is opposite to the expansion direction.
Finally, logplexes are scale-agnostic: there is no need to assume that log2(M) has any particular maximum value. This feature stems from their recursive structure, which is analogous to that of Elias omega codes.
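The logplex construction itself is defined in the paper rather than in this abstract, so as a point of reference here is a minimal Python sketch of a classic universal code, Elias gamma (a simpler relative of the Elias omega codes cited above). Note that gamma codes lack the logplex-specific properties claimed here, such as order preservation and little-endian subparts; the sketch only illustrates what a universal (prefix-free) code looks like, and the function names are hypothetical.

```python
def elias_gamma_encode(n: int) -> str:
    """Encode a positive integer as an Elias gamma code, returned as a bit string."""
    assert n >= 1
    binary = bin(n)[2:]                       # e.g. 5 -> '101'
    return '0' * (len(binary) - 1) + binary   # unary length prefix, then n verbatim

def elias_gamma_decode(bits: str) -> tuple[int, str]:
    """Decode one gamma code from the front of a bit string.

    Returns (value, remaining bits); the remaining bits are untouched,
    which is what makes the code usable in a concatenated stream."""
    zeros = 0
    while bits[zeros] == '0':                 # count the unary length prefix
        zeros += 1
    value = int(bits[zeros:2 * zeros + 1], 2)
    return value, bits[2 * zeros + 1:]
```

Concatenated codes decode back unambiguously, which is the "regardless of the bits which follow them in memory" property the abstract describes.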

**Category:** Digital Signal Processing

[162] **viXra:1705.0188 [pdf]**
*submitted on 2017-05-11 21:39:13*

**Authors:** Russell Leidich

**Comments:** 28 Pages.

Claude Shannon [1] devised a way to quantify the information entropy [2] of a finite integer set, given the probabilities of finding each integer in the set. Information entropy, hereinafter simply "entropy", refers to the number of bits required to encode some such set in a given numerical base (usually binary). Unfortunately, his formula for the "Shannon entropy" seems to have been widely misappropriated as a means of measuring the entropy of such sets by supplanting the probability coefficients (which are generally unknowable) with the normalized frequencies of the integers as they actually occur in the set. This practice is so common that Shannon entropy is often defined in precisely this manner, and indeed this is how we define it here. However, the inaccuracy induced by this compromise may lead to erroneous conclusions, especially where very short or faint signals are concerned. To make matters worse, the numerical behavior of the Shannon entropy formula is rather unstable over large sets, where otherwise it would be more accurate.
Herein we introduce the concept of agnentropy, short for "agnostic entropy", in the sense of an entropy metric which begins with almost no assumptions about the set under analysis. (Technically, it is a "divergence" -- essentially a Kullback-Leibler divergence [3] without the implicit singularities -- because it fails the triangle inequality. We refer to it as a "metric" only in the qualitative sense that it measures something.) This stands in stark contrast to the (compromised) Shannon entropy, which presupposes that the frequencies of integers within a given set are already known. In addition to being more accurate when used appropriately, agnentropy is also more numerically stable and faster to compute than Shannon entropy.
To be precise, Shannon entropy does not measure the number of bits in an invertibly compressed code. It is, more accurately, an underestimation of that value. Unfortunately, the margin of underestimation is not straightforwardly computable, and has size O(Z), where Z is the number of unique integers in the set, assuming that said integers are of predetermined maximum size. By contrast, agnentropy underestimates that bit count by no more than 2, plus the size of 2 logplexes. (Logplexes are universal (affine) codes introduced in [8].) In practice, this overhead amounts to tens of bits, as opposed to potentially thousands of bits for Shannon. This difference has meaningful ramifications for the optimization of both lossless and lossy compression algorithms.
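For concreteness, the "compromised" Shannon entropy described above, with empirical frequencies standing in for the true, unknowable probabilities, can be computed in a few lines. This is an illustrative Python sketch with a hypothetical function name, not code from the paper; agnentropy itself is defined in the paper rather than in this abstract.

```python
import math
from collections import Counter

def shannon_entropy_bits_per_symbol(data: bytes) -> float:
    """Shannon entropy in bits per symbol, computed from the normalized
    frequencies actually observed in `data` (the common 'compromise' in
    which observed frequencies replace the true probabilities)."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

For example, a set of four symbols split evenly between two values has an entropy of 1 bit per symbol, and a constant set has an entropy of 0.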

**Category:** Digital Signal Processing

[161] **viXra:1705.0187 [pdf]**
*replaced on 2017-10-24 23:15:07*

**Authors:** Russell Leidich

**Comments:** 37 Pages.

We have at our disposal a wide variety of discrete transforms for the discovery of "interesting" signals in discrete data sets in any number of dimensions, which are of particular utility when the default assumption is that the set is mundane. SETI, the Search for Extraterrestrial Intelligence, is the archetypical case, although problems in drug discovery, malware detection, financial arbitrage, geologic exploration, forensic analysis, and other diverse fields are perpetual clients of such tools. Fundamentally, these include the Fourier, wavelet, curvelet, wave atom, contourlet, brushlet, etc. transforms which have churned out of math departments with increasing frequency since the days of Joseph Fourier. A mountain of optimized applications has been built on top of them, for example the Fastest Fourier Transform in the West[1] and the Wave Atom Toolbox[2].
Such transforms excel at discovering particular classes of signals, so much so that the return on investment in new math would appear to be approaching zero. What's missing, however, is efficiency: the question must be asked as to when such transforms are computationally justifiable.
Herein we investigate a preprocessing technique, abstractly known as an "entropy transform", which, in a wide variety of practical applications, can discern in essentially real time whether or not an "interesting" signal exists within a particular data set. (Entropy transforms say nothing as to the nature of the signal, but merely how interesting a particular subset of the data appears to be.) Entropy transforms have the added advantage that they can also be tuned to behave as crude classifiers – not as good as their deep learning counterparts, but requiring orders of magnitude less processing power. In applications where identifying many targets with moderate accuracy is more important than identifying a few targets with excellent accuracy, entropy transforms could bridge the gap to product viability.
It would be fair to say that in the realm of signal detection, discrete transforms should be the tool of choice because they tend to produce the most accurate and well characterized results. But processor power and execution time are not free! Particularly when, as in the case of SETI, the bottleneck is the rate at which newly acquired data can be processed, a more productive approach would be to use cheap but reasonably accurate O(N) transforms to filter out all but the most surprising subsets of the data. This would reserve processing capacity for those rare weird cases more deserving of closer inspection.
I published Agnentro[3], an open-source toolkit for signal search and comparison. The reason, first and foremost, was to support these broad and rather unintuitive assertions with numerical evidence. The goal of this paper is to formalize the underlying math.
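An O(N) entropy transform of the kind alluded to above can be sketched as a sliding-window entropy scan with incremental count updates, so that each window costs O(1) amortized. The Python sketch below uses the (compromised) Shannon entropy as the window statistic for simplicity; it is not the Agnentro toolkit's actual algorithm, and the function name and interface are hypothetical.

```python
import math
from collections import Counter

def entropy_transform(data: bytes, window: int) -> list[float]:
    """Slide a fixed-size window across `data`, emitting the Shannon entropy
    (bits/symbol, from observed frequencies) of each window position.

    Counts are updated incrementally as the window slides, so the whole
    scan is O(N) rather than O(N * window)."""
    counts = Counter(data[:window])
    # With W = window and counts c, H = log2(W) - s/W where s = sum(c*log2(c)).
    f = lambda c: c * math.log2(c) if c else 0.0
    s = sum(f(c) for c in counts.values())
    out = [math.log2(window) - s / window]
    for i in range(window, len(data)):
        # Retire the symbol leaving the window, admit the one entering it,
        # patching s by the change in c*log2(c) for each affected count.
        for sym, delta in ((data[i - window], -1), (data[i], +1)):
            s -= f(counts[sym])
            counts[sym] += delta
            s += f(counts[sym])
        out.append(math.log2(window) - s / window)
    return out
```

Peaks (or dips) in the resulting entropy trace mark the "surprising" subsets worth handing to the more expensive transforms.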

**Category:** Digital Signal Processing

[160] **viXra:1705.0186 [pdf]**
*submitted on 2017-05-11 22:43:22*

**Authors:** Yanming Wei

**Comments:** 10 pages, 6 figures. DOI: 10.13140/RG.2.2.33349.45285

Inspired by the webbed feet of ducks and the wings of birds, human beings have experimented for many centuries with flapping-wing aircraft, or ornithopters, capable of vertical takeoff, but have yet to commercialize one. To date, only rotary-propeller-driven helicopters have achieved great success. Here I propose a flapping-umbrella-driven ornithopter powered by a pulse power supply. Upon commercialization of such inventions, humankind will benefit in many respects, e.g. affordable personal aerial commuting, remote internet service, goods shipping, etc.

**Category:** Classical Physics

[159] **viXra:1705.0185 [pdf]**
*submitted on 2017-05-12 02:55:43*

**Authors:** Bill Gaede

**Comments:** 13 Pages.

Physics has evolved from an attempt by ancient researchers to understand the workings of their immediate surroundings to a body of mathematical descriptions and paradoxical physical interpretations. We have today no rational explanation for the simplest of systems and phenomena, for instance, how a magnet physically attracts another from a distance or by what physical means the Earth prevents the Moon from leaving the Solar System. Not one mathematical physicist can explain in a logical manner why a pen falls to the floor rather than to the ceiling. The equations suggest that ‘mass attracts mass’ or that ‘north attracts south’, but these are mere descriptions. They give us no insight as to the physical mechanisms underlying such phenomena. We trace these shortcomings to the nature of the scientific method inherited primarily from 17th Century researchers. We argue, in essence, that the current version of the scientific method is divorced from authentic Science. Here we propose an alternative – henceforth known as the Rational Scientific Method (capitalized to distinguish it from what is currently regarded as such) – and outline the steps necessary to present rational explanations for physical phenomena.

**Category:** History and Philosophy of Physics

[158] **viXra:1705.0184 [pdf]**
*submitted on 2017-05-12 03:38:45*

**Authors:** George Rajna

**Comments:** 38 Pages.

An international team of scientists has produced the world's first computerised tomography (CT) images of biological tissue using protons – a momentous step towards improving the quality and feasibility of Proton Therapy for cancer sufferers around the world. [21] Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20] Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was long considered impossible: they have developed a new fluorescence microscope, called MINFLUX, which for the first time makes it possible to optically separate molecules that are only nanometers (one millionth of a millimeter) apart from each other. [19] Dipole orientation provides a new dimension in super-resolution microscopy. [18] Fluorescence is an incredibly useful tool for experimental biology, and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17] Molecules that change colour can be used to follow in real time how bacteria form a protective biofilm around themselves. This new method, developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and in the food industry, where bacterial biofilms are a problem. [16] Researchers led by Carnegie Mellon University physicist Markus Deserno and University of Konstanz (Germany) chemist Christine Peter have developed a computer simulation that crushes viral capsids. By allowing researchers to see how the tough shells break apart, the simulation provides a computational window for looking at how viruses and proteins assemble. [15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. [13]

**Category:** Physics of Biology

[157] **viXra:1705.0182 [pdf]**
*submitted on 2017-05-11 05:51:50*

**Authors:** Wan-Chung Hu

**Comments:** 11 Pages.

Earthquakes are thought to be due to plate tectonic movement. However, this theory has several fatal defects which have prevented successful earthquake prediction. First, the sudden onset of an earthquake at a certain point cannot be due to chronic, large-scale continental drift. Second, continental drift is only proven between South America and Africa, which cannot explain all the mechanisms of earthquakes. Third, plate movement theory cannot explain huge intraplate earthquakes. Fourth, old crust sinks into troughs after the generation of new crust at mid-ocean ridges; thus it cannot be the driving force behind earthquakes. Fifth, during earthquakes we experience a longitudinal P wave first, followed by a transverse S wave; if plate movement caused earthquakes, we should experience a transverse S wave first and predominantly. Here, I propose that earthquakes are actually the abrupt release of electromagnetic radiation from faults. The high temperature inside the Earth can generate such radiation, especially after accumulation over several decades. This new theory can explain why earthquakes tend to happen at hotspots such as Hawaii or in circum-Pacific bands, because of crust fissures and troughs. It can explain possible huge intraplate earthquakes. It can also explain sunquakes and moonquakes, which cannot be explained by plate movement theory, and why a supermoon tends to induce earthquakes. The mechanism of earthquakes is the gravitational acceleration produced by the outgoing light, because light is an electromagnetic wave as well as a gravity wave. This mechanism can also explain earthquake lights, ionosphere anomalies and EM field anomalies during earthquakes. This new theory will lead to successful earthquake prediction.

**Category:** Geophysics

[156] **viXra:1705.0181 [pdf]**
*submitted on 2017-05-10 13:18:27*

**Authors:** George Rajna

**Comments:** 27 Pages.

National Institute of Standards and Technology (NIST) physicists have solved the seemingly intractable puzzle of how to control the quantum properties of individual charged molecules, or molecular ions. [20] Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16] Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. 
Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

**Category:** Quantum Physics

[155] **viXra:1705.0178 [pdf]**
*submitted on 2017-05-10 20:04:38*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 2 Pages. 1 illustrative graph

In stellar metamorphosis theory it is noted that after stars form their global magnetic fields, they decrease in strength as the star evolves. The global magnetic field is not limitless energy, but is directly proportional to the amount of mechanical motion in the interior of the star. This means the bigger the global field, the more active/younger the star.

**Category:** Astrophysics

[154] **viXra:1705.0177 [pdf]**
*submitted on 2017-05-10 22:54:08*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 7 Pages.

The super-planetary state and the self-management of democratic political economy.

**Category:** Social Science

[153] **viXra:1705.0176 [pdf]**
*replaced on 2017-05-29 09:54:35*

**Authors:** Sylwester Kornowski

**Comments:** 5 Pages.

Can we guess the initial conditions for the Theory of Everything (ToE)? We understand such initial conditions as the set of all parameters, initial symmetries, and initial equations. Initial symmetries and initial equations can point to possible phase transitions which can lead to additional symmetries and additional equations, called here the additional conditions. Such additional conditions result from the initial conditions, so they do not decrease the consistency of a theory. On the other hand, anomalies appearing in a theory that cannot be explained within the initial and additional conditions always lead to new/free parameters. Free parameters need ad hoc hypotheses (i.e. corrections that do not result from the initial and additional conditions), which always weaken theories. Elimination of ad hoc/free parameters by increasing the number of initial conditions causes Occam's razor to be a determinant of the consistency of theories describing the same phenomena. Occam's razor is defined as follows: "Among competing hypotheses, the one with the fewest assumptions should be selected" [1]. This means that the consistency of a theory can be defined as the inverse of the sum of all parameters, initial symmetries and initial equations (the sum of the elements of the three different groups of initial conditions). New symmetries and new equations which appear in a natural way at higher levels of the ToE (the Standard Model (SM) and General Relativity (GR) are higher levels of the ToE), if we know the lowest levels of the ToE, do not decrease the consistency of the theory. Authors of theories add ad hoc hypotheses to prevent their theories from being falsified. Such a non-scientific method causes theories to become more and more complex, so their consistency becomes lower and lower. In physics, naturalness means that the dimensionless ratios between parameters take values of order 1. Parameters varying by many orders of magnitude need so-called fine-tuning symmetries.
This suggests that fine-tuned theories should be more complex, i.e. their consistency should be lower. But Nature shows the opposite, which leads to the conclusion that fine-tuned theories are closer to the ToE. Here we guess the initial conditions for the ToE, and we explain why the consistency of the ToE presented here is the highest and why it is a fine-tuned theory. The consistency factor of the ToE presented here is 1/(7+5+4) = 0.0625, which is the highest possible value for ToE-like theories. The consistency factor of the SM is much lower, so it is an incomplete theory, sometimes leading to incorrect results.

**Category:** Quantum Gravity and String Theory

[152] **viXra:1705.0175 [pdf]**
*submitted on 2017-05-11 00:49:50*

**Authors:** Zuodong Sun

**Comments:** 11 Pages.

This is a new idea based on the effective treatment of Parkinson's disease and Alzheimer's disease with transcranial magnetoelectric stimulation technology. It supports the hypothesis that voltage-gated Ca2+ channels are the best target for activation by physical means. The basic content is as follows: Parkinson's disease, Alzheimer's disease and other neurodegenerative diseases are closely related to physically gated ion channels and can therefore be treated by physical means; activating neurotransmitter-releasing neurons plays a key role in the treatment, and voltage-gated Ca2+ channels are the best target for physical means, the purpose being to induce Ca2+ inflow and trigger the synaptic vesicles at neuronal axon terminals to release neurotransmitters. The theory of brain cell activation sets forth the principle, method and purpose of treating physically gated ion channel diseases such as Alzheimer's disease, Parkinson's disease and other neurodegenerative diseases. It indicates that attempting to treat these diseases using only pharmaceutical and chemical approaches could shake our confidence in conquering them, and that the application of physical approaches, or the combined application of physical and chemical approaches, to the treatment of some major encephalopathies may be our main research direction in the future.

**Category:** Physics of Biology

[151] **viXra:1705.0174 [pdf]**
*submitted on 2017-05-10 11:02:05*

**Authors:** George Rajna

**Comments:** 30 Pages.

Einstein's "spooky action at a distance" persists even at high accelerations, researchers of the Austrian Academy of Sciences and the University of Vienna were able to show in a new experiment. [19] Researchers have devised an improved method for checking whether two particles are entangled. [18] A group of researchers from the Faculty of Physics at the University of Warsaw has shed new light on the famous paradox of Einstein, Podolsky and Rosen after 80 years. They created a multidimensional entangled state of a single photon and a trillion hot rubidium atoms, and stored this hybrid entanglement in the laboratory for several microseconds. [17] Members of the Faculty of Physics at the Lomonosov Moscow State University have elaborated a new technique for creating entangled photon states. [16] Quantum mechanics, with its counter-intuitive rules for describing the behavior of tiny particles like photons and atoms, holds great promise for profound advances in the security and speed of how we communicate and compute. [15] University of Oregon physicists have combined light and sound to control electron states in an atom-like system, providing a new tool in efforts to move toward quantum-computing systems. [14] Researchers from the Institute for Quantum Computing at the University of Waterloo and the National Research Council of Canada (NRC) have, for the first time, converted the color and bandwidth of ultrafast single photons using a room-temperature quantum memory in diamond. [13] One promising approach for scalable quantum computing is to use an all-optical architecture, in which the qubits are represented by photons and manipulated by mirrors and beam splitters. So far, researchers have demonstrated this method, called Linear Optical Quantum Computing, on a very small scale by performing operations using just a few photons. 
In an attempt to scale up this method to larger numbers of photons, researchers in a new study have developed a way to fully integrate single-photon sources inside optical circuits, creating integrated quantum circuits that may allow for scalable optical quantum computation. [12] Spin-momentum locking might be applied to spin photonics, which could hypothetically harness the spin of photons in devices and circuits. Whereas microchips use electrons to perform computations and process information,

**Category:** Quantum Physics

[150] **viXra:1705.0173 [pdf]**
*replaced on 2017-05-12 13:18:20*

**Authors:** Ilija Barukčić

**Comments:** 24 pages. Copyright © 2017 by Ilija Barukčić, Jever, Germany. Published by:

The division of zero by zero has turned out to be a long-lasting and unending puzzle in mathematics and physics. An end to this long discussion is not in sight. In particular, zero divided by zero is treated as indeterminate, such that a result cannot be determined. It is the purpose of this publication to solve the problem of the division of zero by zero while relying on the general validity of classical logic. According to classical logic, zero divided by zero is one.

**Category:** Set Theory and Logic

[149] **viXra:1705.0172 [pdf]**
*submitted on 2017-05-10 12:43:05*

**Authors:** George Rajna

**Comments:** 29 Pages.

Microsoft on Wednesday unveiled new tools intended to democratize artificial intelligence by enabling machine smarts to be built into software from smartphone games to factory floors. [17] The closer we can get a machine translation to be on par with expert human translation, the happier lots of people struggling with translations will be. [16] Researchers have created a large, open source database to support the development of robot activities based on natural language input. [15] A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14] A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13] Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12] A team of researchers working at the University of California (and one from Stony Brook University) has for the first time created a neural-network chip that was built using just memristors. In their paper published in the journal Nature, the team describes how they built their chip and what capabilities it has. [11] A team of researchers used a promising new material to build more functional memristors, bringing us closer to brain-like computing. Both academic and industrial laboratories are working to develop computers that operate more like the human brain.
Instead of operating like a conventional, digital system, these new devices could potentially function more like a network of neurons. [10] Cambridge Quantum Computing Limited (CQCL) has built a new Fastest Operating System aimed at running the futuristic superfast quantum computers. [9]

**Category:** Artificial Intelligence

[148] **viXra:1705.0171 [pdf]**
*submitted on 2017-05-10 08:05:54*

**Authors:** George Rajna

**Comments:** 30 Pages.

Jarvis Loh, Gan Chee Kwan and Khoo Khoong Hong from the Agency for Science, Technology and Research (A*STAR) Institute of High Performance Computing, Singapore, have modeled these minute spin spirals in nanoscopic crystal layers. [18]
Some of the world's leading technology companies are trying to build massive quantum computers that rely on materials super-cooled to near absolute zero, the theoretical temperature at which atoms would cease to move. [17]
While technologies that currently run on classical computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn't exist and the possibilities that you need to explore to get to the answer are too enormous to ever be processed by classical computers. [16]
Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15]
Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[147] **viXra:1705.0169 [pdf]**
*submitted on 2017-05-10 09:44:23*

**Authors:** George Rajna

**Comments:** 13 Pages.

Physicists have theoretically shown that a superconducting current of electrons can be induced to flow by a new kind of transport mechanism: the potential flow of information. [27] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators as well, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Since superconductivity is basically a quantum mechanical phenomenon, and certain entangled particles provide this opportunity in specific materials (such as Cooper pairs and other entanglements in strongly correlated materials, and exciton-mediated electron pairing), we can say that the secret of superconductivity is quantum entanglement.

**Category:** Quantum Physics

[146] **viXra:1705.0168 [pdf]**
*submitted on 2017-05-10 05:52:48*

**Authors:** Solomon Budnik

**Comments:** 1 Page. CREATION OF LEVITATING MATERIALS AND DEVICES

Lunar dust is levitated from the surface by powerful electrostatic charges generated by interplanetary radiation swirling across the landscape. In fact, electrical charges might even produce dust 'fountains'. As the rising Sun's light and radiation sweep across the lunar surface, they could generate large positive charges, enough to levitate dust particles of active metals a mile high, until they drop back, only to be levitated again like a pulsing fountain.

**Category:** Quantum Physics

[145] **viXra:1705.0167 [pdf]**
*submitted on 2017-05-10 05:55:08*

**Authors:** Fang Zhou

**Comments:** 10 pages in Chinese

The Galilean Transformation exhibits a composition of observers' observation vectors. The author of this article presents the simplest deduction of the space-time transformation objectively existing in motion observation under the circumstance of a finite light velocity, utilizing the inherent properties of the Galilean Transformation.

**Category:** Relativity and Cosmology

[144] **viXra:1705.0166 [pdf]**
*replaced on 2018-01-25 01:56:40*

**Authors:** Arieh Sher

**Comments:** 36 Pages.

The Pivot theory is a new theory of the structure of the Universe. It postulates that the Universe is composed of a massive spinning body, the Pivot, and a ring of a finite visible Universe orbiting it. The Pivot can be described by two theories that are currently not unified. From the GR point of view, the Pivot is a Kerr black hole; from the quantum physics point of view, it is a super-dense hadron. It is shown that by combining the two theories, QM and GR, the gravitational constant G can be calculated from other fundamental physical constants.
The Pivot theory, described in this article, consists of two parts. The first relates to the questions of how it all began and how the Universe evolved into the Pivot structure. This part is not fully addressed. The Pivot theory is based on the theory of the primeval hadron. Using the primeval hadron theory, the structure of the Universe is explained and its size is calculated. It is shown that the primeval hadron theory can answer questions relating to time, namely what happened at the exact moment of the Universe's creation, i.e. t=0, and, more importantly, what happened before t=0.
The second part relates to the current structure of the Universe. The structure of the Pivot Universe is in accord with many known cosmological observations, among them the flattened rotation curves of spiral galaxies, the spiral shape of galaxies, and the redshift of galaxies.

**Category:** Relativity and Cosmology

[143] **viXra:1705.0165 [pdf]**
*replaced on 2018-02-14 22:45:03*

**Authors:** Nicholas R. Wright

**Comments:** 6 Pages. Replaced "Pareto" with "Pareto Improvement"

We prove the Navier-Stokes equations by means of the Metabolic Theory of Ecology and the Rule of 72. Macroecological theories serve as a proof of the Navier-Stokes equations. A solution could be found using Kleiber's Law. Measurement is possible through the heat calorie. A Pareto Improvement exists within the Navier-Stokes equations. This is done by superposing dust solutions onto fluid solutions. In summary, the Navier-Stokes equations require a theoretical solution. The Metabolic Theory of Ecology, along with Kleiber's Law, forms a theory by such standards.
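For context, the Rule of 72 invoked in the abstract is a standard compounding shortcut: a quantity growing at r percent per period doubles in roughly 72/r periods. A minimal sketch of that rule (illustrative only, not the authors' method; the function names are my own):

```python
import math

def doubling_time_exact(rate_percent):
    """Exact doubling time in periods for a growth rate given in percent."""
    return math.log(2) / math.log(1 + rate_percent / 100)

def doubling_time_rule_of_72(rate_percent):
    """Rule-of-72 approximation: 72 divided by the percentage rate."""
    return 72 / rate_percent

# At 8% growth per period the approximation is very close to the exact value.
print(doubling_time_rule_of_72(8))       # 9.0
print(round(doubling_time_exact(8), 2))  # 9.01
```

The approximation works because ln 2 ≈ 0.693 and 72 is a nearby number with many small divisors.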

**Category:** Functions and Analysis

[142] **viXra:1705.0164 [pdf]**
*submitted on 2017-05-09 18:02:58*

**Authors:** Ahmed Ibrahim Mohamed Ahmed, Mohamed Yehia Zakaria Arafa, Shady Essam Ramzy Taodharos

**Comments:** 7 Pages, E-mail: 15004@stemegypt.edu.eg

Every developed country depends on industry as the main factor of its economy. Lack of exports and depression in both the general economy and the value of the currency are consequences of neglecting industry. All countries work on increasing the efficiency of their industries, whether by working on the input, the output, the cost or the time of the process. The plastic industry is considered one of the most important industries because plastic is an important factor in the making of many useful products such as sheets, tubes, rods, slabs, building blocks and domestic products. Making bio-plastic from banana peels instead of traditional petroleum-based plastic is believed to be a successful solution to increase the efficiency of the plastic industry. The solution produces the same amount of plastic with higher efficiency and durability, at little cost and in less time compared with normal plastic, so it meets the design requirements of any successful solution, which are production, efficiency and cost. The prototype of this project represents the process of manufacturing bio-plastic from banana peels and tests the durability and efficiency of the plastic produced. The results showed that the plastic produced could bear one and a half times more weight than petroleum-based plastic, so it is suitable for use in the making of traditional plastic products. In conclusion, the test results showed that this project is a perfect solution to develop the plastic industry.

**Category:** Chemistry

[141] **viXra:1705.0163 [pdf]**
*submitted on 2017-05-09 20:35:43*

**Authors:** Terubumi Honjou

**Comments:** 4 Pages.

The photon group of sunlight arriving at the Pioneer space exploration satellite triggers gravity.
I maintain that the stars around the Milky Way are held by the gravity of the photon group surrounding the Milky Way.
In the wave packet of the matter wave called the pilot wave, a photon and a graviton should exist.
A photon and a graviton leave at the velocity of light c in different wave packets.
In quantum mechanics, the electromagnetic force is said to act by the exchange of photons.
It is thought that gravity acts by the exchange of gravitons.
I think that gravity and the electromagnetic force act through the same photon, by the hypothesis of the elementary particle pulsation principle.
I understand that the photon in its negative-particle phase is a graviton.
The space formed by the negative-particle phase of the photon group that fills outer space is a distortion of space.
This agrees with the general theory of relativity, which assumed gravity to be a distortion of space.
The electromagnetic force and gravity are the front and back of a photon.
Five wave packets are equivalent to five cycles of pulsation.
The wave packet decays, but the pulsation does not.

**Category:** Astrophysics

[140] **viXra:1705.0162 [pdf]**
*replaced on 2017-10-14 20:04:39*

**Authors:** Chuanli Chen

**Comments:** 29 Pages.

Many theories claim to explain economic crises and periodic fluctuations in the economy; however, most of them are imperfect at explaining many phenomena throughout history. In this paper, I put forward a new theory and model that explains economic crises and periodic fluctuations in the economy, as well as policies for avoiding economic crises. This paper analyzes the direction of currency flow in the free market and explains why the market is constantly whirling. It also discusses the relationship between money flow speed and GDP, explaining why accelerating the speed of money flow in the market can make a country rich.
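The claimed link between money flow speed and GDP is conventionally captured by the quantity-theory identity MV = PQ, where nominal GDP is P·Q and V is the velocity of money. A minimal sketch of that standard identity (not the author's model; the figures below are hypothetical):

```python
def velocity_of_money(nominal_gdp, money_supply):
    """Quantity theory identity M * V = P * Q, so V = nominal GDP / M."""
    return nominal_gdp / money_supply

# Hypothetical figures: $20 trillion nominal GDP, $4 trillion money supply.
v = velocity_of_money(20e12, 4e12)
print(v)  # 5.0  (each unit of money turns over five times per year)
```

Holding M fixed, a higher V mechanically corresponds to a higher nominal GDP, which is the conventional reading of "faster money flow makes a country richer."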

**Category:** Economics and Finance

[139] **viXra:1705.0161 [pdf]**
*submitted on 2017-05-10 01:02:13*

**Authors:** Stephen J Crothers

**Comments:** 4 Pages.

According to the Theory of Relativity the Universe is an amalgam of time and space containing matter: a four-dimensional spacetime continuum alleged to be an analytic generalisation of the Theorem of Pythagoras from three dimensions. Spacetime is said to be curved by matter and to undergo rippling due to gravitational waves travelling at the speed of light. Points in spacetime are called 'events'. The distance between two events is called the spacetime interval, which is manifest as a distance formula, often called a metric or line-element, in terms of 'coordinates'. However, Minkowski-Einstein spacetime is not actually a four-dimensional continuum, because it is self-referential via the speed of light.
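For reference, the line-element the abstract is disputing is, in the standard formulation, the Minkowski interval in inertial coordinates (with the (−,+,+,+) sign convention):

$$ds^{2} = -c^{2}\,dt^{2} + dx^{2} + dy^{2} + dz^{2},$$

with two events being timelike, null, or spacelike separated according to whether $ds^{2}$ is negative, zero, or positive.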

**Category:** Relativity and Cosmology

[138] **viXra:1705.0160 [pdf]**
*submitted on 2017-05-10 01:14:50*

**Authors:** Evgeny A. Novikov

**Comments:** 6 Pages.

New, simple and exact analytical solutions of the Einstein equations of general relativity (GR) and of the Qmoger (quantum modification of GR) equations are obtained. These solutions correspond to processes with an invariant density of enthalpy (energy plus pressure). An interpretation of these solutions in terms of cosmic radiation and the production of massive particles, as well as a comparison with cosmic data (without fitting), is presented. It is suggested that isenthalpic processes can also be relevant to the excessive radiation from Jupiter and Saturn. Similar processes could potentially be used as a new source of energy on Earth.

**Category:** Astrophysics

[137] **viXra:1705.0159 [pdf]**
*submitted on 2017-05-10 03:04:07*

**Authors:** George Rajna

**Comments:** 30 Pages.

Researchers from the University of Illinois at Urbana-Champaign have demonstrated a new level of optical isolation necessary to advance on-chip optical signal processing. The technique involving light-sound interaction can be implemented in nearly any photonic foundry process and can significantly impact optical computing and communication systems. [22]
City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21]
Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20]
Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19]
Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18]
A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17]
Scientists at the University of Sussex have invented a ground-breaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]
Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the pattern to the other, which violates CP and time-reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.
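The boson sampling mentioned at [11] above is believed to be classically hard because its output probabilities are proportional to squared permanents of submatrices of the interferometer's unitary, and computing the permanent is #P-hard. A small illustration of the permanent via Ryser's formula (my own sketch, not taken from the cited work):

```python
from itertools import combinations

def permanent(matrix):
    """Matrix permanent via Ryser's formula, O(2^n * n^2) time."""
    n = len(matrix)
    total = 0
    for r in range(1, n + 1):            # all non-empty column subsets
        for cols in combinations(range(n), r):
            prod = 1
            for row in matrix:           # product over rows of partial row sums
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# For the all-ones 3x3 matrix the permanent equals 3! = 6.
print(permanent([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))  # 6
```

Unlike the determinant, no polynomial-time algorithm for the permanent is known, which is the source of the sampling task's presumed hardness.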

**Category:** Quantum Physics

[136] **viXra:1705.0157 [pdf]**
*submitted on 2017-05-10 04:48:19*

**Authors:** Bhargabjyoti Saikia, Rupaban Subadar

**Comments:** 12 Pages.

An analysis of an Optimum Power and Rate Adaptation (OPRA) technique has been carried out for Multilevel Quadrature Amplitude Modulation (M-QAM) over Nakagami-m flat fading channels, considering an imperfect channel estimation at the receiver side. The optimal solution has been derived for a continuous adaptation; it is a specific bound function and cannot be expressed in closed mathematical form. Therefore, a sub-optimal solution is derived for the continuous adaptation, and it has been observed that it tends to the optimum solution as the correlation coefficient between the true channel gain and its estimate tends to one. It has also been observed that the receiver performance degrades with an increase in estimation error.
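For background, the Nakagami-m fading model referenced above describes the envelope of the received signal with density f(x) = 2 m^m x^(2m−1) exp(−m x²/Ω) / (Γ(m) Ω^m) for x ≥ 0, where Ω is the mean square envelope and m ≥ 1/2 the fading figure; m = 1 recovers Rayleigh fading. A minimal sketch of this textbook density (not the paper's derivation; the function name is mine):

```python
import math

def nakagami_pdf(x, m=1.0, omega=1.0):
    """Nakagami-m fading envelope pdf; m=1 reduces to Rayleigh (scale Omega)."""
    if x < 0:
        return 0.0
    return (2 * m**m * x**(2 * m - 1)
            / (math.gamma(m) * omega**m)
            * math.exp(-m * x**2 / omega))

# Sanity check: for m = 1, Omega = 1 this is the Rayleigh pdf 2x * exp(-x^2).
x = 0.5
print(math.isclose(nakagami_pdf(x, m=1.0), 2 * x * math.exp(-x**2)))  # True
```

Larger m means milder fading (the envelope concentrates around sqrt(Ω)), which is why performance results in such analyses are usually parametrized by m.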

**Category:** Digital Signal Processing

[135] **viXra:1705.0155 [pdf]**
*submitted on 2017-05-09 11:33:22*

**Authors:** S.M.Hosseini, M.I.Kendrick

**Comments:** 23 Pages.

The law that states "matter can be converted to energy and vice versa" needs also to incorporate that "anti-matter can be converted to anti-energy and vice versa".
With this assumption, the Universe can be modelled with precisely equal amounts of energy and anti-energy prior to the Big Bang, which can cause the formation of a single particle that would be the building block of the entire Universe, from matter to the forces of nature in their different manifestations.
Hence it could be calculated and identified as the quantum gravity particle.
This particle has the values of the Planck [1] mass, time, frequency and distance.
In this paper the calculation and the different manifestations of this particle are shown to be precisely in agreement with the theory of the Hot Big Bang and in accordance with observations in particle physics, cosmology and the laws of nature.
The quantum gravity particle is the force behind the expansion of the universe [2], the unification of the forces of nature [3], and the wave, particle and luminiferous-aether trinity of light [4].
This particle is made of energy encapsulating a precisely equal amount of anti-energy in the form of a perfect sphere, the most symmetrical shape in the Universe.
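The Planck-scale values the abstract assigns to this particle follow, in the standard definitions, from combining ħ, G and c. A quick computation using CODATA-style constants (my own illustration, not taken from the paper):

```python
import math

# CODATA-style constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_mass = math.sqrt(hbar * c / G)       # ~2.18e-8 kg
planck_length = math.sqrt(hbar * G / c**3)  # ~1.62e-35 m
planck_time = planck_length / c             # ~5.39e-44 s

print(f"{planck_mass:.3e} kg, {planck_length:.3e} m, {planck_time:.3e} s")
```

These are the unique mass, length and time scales constructible from ħ, G and c alone (the "Planck frequency" is the reciprocal of the Planck time).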

**Category:** Quantum Gravity and String Theory

[134] **viXra:1705.0154 [pdf]**
*submitted on 2017-05-09 12:16:42*

**Authors:** Mesut Kavak

**Comments:** 1 Page.

This work aims to bring a simple solution to the Riemann Hypothesis via the Lagarias transformation.
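The Lagarias result referred to here is presumably the elementary criterion that the Riemann Hypothesis holds if and only if σ(n) ≤ H_n + e^(H_n) ln H_n for every n ≥ 1 (with equality only at n = 1), where σ is the sum-of-divisors function and H_n the n-th harmonic number. A small sketch checking the inequality for small n (my own illustration, not the paper's argument):

```python
import math

def sigma(n):
    """Sum of divisors of n (naive O(n) scan, fine for small n)."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

def harmonic(n):
    """n-th harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

def lagarias_bound(n):
    """Lagarias: RH iff sigma(n) <= H_n + exp(H_n) * ln(H_n) for all n >= 1."""
    h = harmonic(n)
    return h + math.exp(h) * math.log(h)

# The inequality holds (strictly, for n > 1) throughout this small range.
for n in range(2, 50):
    assert sigma(n) < lagarias_bound(n)
print("Lagarias inequality verified for 2 <= n < 50")
```

Highly composite n such as 12 (σ(12) = 28 against a bound of about 28.3) come closest to the bound, which is where the criterion bites.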

**Category:** Number Theory

[133] **viXra:1705.0153 [pdf]**
*replaced on 2017-05-16 05:38:01*

**Authors:** Leo Vuyk

**Comments:** 34 Pages.

According to Function Follows Form Theory, the Hawking mathematics of a black hole horizon should be interpreted differently. Too weird to be true? My conclusion is that Stephen Hawking is still too anxious to admit that the splitting of positive mass (going out) and negative mass (going in) at the black hole horizon should lead to a violation of the second law of thermodynamics by entropy decrease. So the Hawking mathematics of a black hole horizon should be interpreted differently. All fermions (with positive mass) should be repelled at one of the fermion-repelling horizons with positive or negative charge. As a result, each black hole is assumed to have a globular shell of positron- and proton-based plasma on the inside and a negative, electron-based plasma shell outside this positive shell. The ingoing negative mass (suggested by Hawking) could be interpreted as the anti-material entangled copy-symmetric shadow partners popping up into distant entangled anti-material copy universes, and as the origin of the interference effect in a one-photon double-slit experiment as suggested by David Bohm. If we include the existence of small interference black holes (or Quantum Knots), then ball lightning and micro comets (including comets and sunspots) are to be explained as micro black holes violating the second law of thermodynamics.
The FORM and MICROSTRUCTURE of elementary particles is supposed to be the origin of FUNCTIONAL differences between Higgs, Graviton, Photon and Fermion particles. As a consequence, a NEW splitting, accelerating and pairing MASSLESS BLACK HOLE, able to convert vacuum energy (ZPE) into real energy by entropy decrease, seems able to explain quick galaxy and star formation, down to sunspots, (micro) comets, lightning bolts, sprites and elves, sprite fireballs and ball lightning. Recently the NASA-SOHO satellite photos showed clear evidence of multiple hotspots created at the solar surface by interference. I assume that the majority of the hotspots can be compared with micro-comet or fireball phenomena related to sprites.

**Category:** Astrophysics

[132] **viXra:1705.0152 [pdf]**
*submitted on 2017-05-09 12:39:34*

**Authors:** Edgar Valdebenito

**Comments:** 3 Pages.

This note presents some formulas related to Ahmed's integral.

**Category:** Number Theory

[131] **viXra:1705.0151 [pdf]**
*submitted on 2017-05-09 12:43:55*

**Authors:** Edgar Valdebenito

**Comments:** 19 Pages.

This note presents formulas related to the Euler-Mascheroni constant and fractals.
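As background, the Euler-Mascheroni constant is γ = lim_(n→∞)(H_n − ln n) ≈ 0.5772156649, where H_n is the n-th harmonic number. A direct numerical sketch of this defining limit (illustrative, not from the note itself; the convergence remark is a standard fact):

```python
import math

def gamma_approx(n):
    """Approximate the Euler-Mascheroni constant as H_n - ln(n)."""
    return sum(1.0 / k for k in range(1, n + 1)) - math.log(n)

# Convergence is slow: the error is ~ 1/(2n), so a million terms
# give roughly six correct decimal places.
est = gamma_approx(10**6)
print(round(est, 6))  # 0.577216 (true value 0.5772156649...)
```

Subtracting the known 1/(2n) correction is a common way to accelerate this computation.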

**Category:** Number Theory

[130] **viXra:1705.0150 [pdf]**
*submitted on 2017-05-09 07:37:33*

**Authors:** Stephen I. Ternyik

**Comments:** 9 Pages.

Economic Research into Exponentiality.

**Category:** Social Science

[129] **viXra:1705.0149 [pdf]**
*submitted on 2017-05-09 07:55:33*

**Authors:** George Rajna

**Comments:** 29 Pages.

Some of the world's leading technology companies are trying to build massive quantum computers that rely on materials super-cooled to near absolute zero, the theoretical temperature at which atoms would cease to move. [17]

While technologies that currently run on classical computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn't exist and the possibilities that you need to explore to get to the answer are too enormous to ever be processed by classical computers. [16]

Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary, have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15]

Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14]

A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]

A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]

Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]

While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.

The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories.

The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the pattern to the other, which violates CP and time-reversal symmetry.

The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[128] **viXra:1705.0148 [pdf]**
*submitted on 2017-05-09 08:27:57*

**Authors:** George Rajna

**Comments:** 26 Pages.

The power of big data is used in a strategy developed by A*STAR to improve the security of networks of internet-connected objects, known as the Internet of Things (IoT), a technology which will make everything from streetlights to refrigerators 'smart'. [16]

Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary, have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15]

Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14]

A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]

A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]

Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]

While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.

The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories.

The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the pattern to the other, which violates CP and time-reversal symmetry.

The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[127] **viXra:1705.0147 [pdf]**
*replaced on 2017-05-12 05:54:22*

**Authors:** D. Chakalov

**Comments:** 6 Pages. Two references added. Final version.

We model the physicalized manifestation of the Universe as bootstrapped ‘Brain of the Universe’ and seek evidence for its brain-like functional organization, resulting from the Holon of the Universe facilitated by space-like correlated gravitational holomovement and rotation. The orthodox model of gravity, based on "tangent vectors" and "curvature of spacetime", is replaced with the proposal that the physicalized clocks and rulers are very flexible ‘jackets’ (cf. John’s jackets parable in CEN.pdf), which can slow down or speed up viz. shrink or expand, leading to perfectly correlated Brain of the Universe living in so-called ‘relative scale’ (RS) spacetime. The question of Universal Mind, complementing the Brain of the Universe, pertains to physical theology and the doctrine of trialism, and was examined in previous publications (Sec. 6 in spacetime.pdf).

**Category:** Relativity and Cosmology

[126] **viXra:1705.0146 [pdf]**
*submitted on 2017-05-09 10:30:21*

**Authors:** Yannan Yang

**Comments:** 4 Pages.

By analyzing the charge and electric field distribution for some charge systems that contain a neutral conductor cavity, we found phenomena that violate Gauss's Law. In some cases, the net electric flux through a closed surface is equal to zero although there is net electric charge within it. In the other case, there is net electric flux through a closed surface, but the net electric charge within that surface is not zero.
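For reference, the law being challenged states that the net flux through any closed surface equals the enclosed charge divided by ε₀, independent of the surface's shape or size. A numerical sketch of the textbook statement for a point charge (standard physics, not the paper's claimed counterexample; the function name is mine):

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def flux_through_sphere(q, r):
    """Flux of a point charge's field through a concentric sphere of radius r."""
    e_field = q / (4 * math.pi * EPS0 * r**2)  # Coulomb field magnitude
    return e_field * 4 * math.pi * r**2        # E is normal and uniform on the sphere

# Gauss's law: the flux equals q / eps0 regardless of the sphere's radius.
q = 1e-9  # 1 nC
print(math.isclose(flux_through_sphere(q, 0.1), flux_through_sphere(q, 10.0)))  # True
print(round(q / EPS0, 1))  # 112.9 (V*m)
```

The r² in the field's denominator cancels the r² in the sphere's area, which is why the enclosed-charge form holds for any closed surface around the charge.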

**Category:** Classical Physics

[125] **viXra:1705.0145 [pdf]**
*submitted on 2017-05-09 10:47:31*

**Authors:** George Rajna

**Comments:** 24 Pages.

The movement of atoms through a material can cause problems under certain circumstances. Atomic-resolution electron microscopy has enabled researchers at Linköping University in Sweden to observe for the first time a phenomenon that has eluded materials scientists for many decades. [15]

By taking advantage of a phenomenon known as "quantum mechanical squeezing," researchers have conceptually designed a new method of applying atomic force microscopy. [14]

In modern physics of the past century, understanding the electronic properties and interactions between electrons inside matter has been a major challenge. [13]

An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons – thought to be indivisible building blocks of nature – to break into pieces. [12]

In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]

Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent" – that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]

The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories.

The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass ratio and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the pattern to the other, which violates CP and time-reversal symmetry.

The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the relativistic quantum theory. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron–proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.
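The de Broglie wavelength invoked in the quantum-liquid excerpt above is, in the non-relativistic case, λ = h/p = h/(mv). A quick numerical sketch (a standard formula, not from the compiled articles; the constant names are mine):

```python
H_PLANCK = 6.62607015e-34      # Planck constant, J*s
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg

def de_broglie_wavelength(mass, velocity):
    """Non-relativistic de Broglie wavelength lambda = h / (m * v)."""
    return H_PLANCK / (mass * velocity)

# An electron at 10^6 m/s has a wavelength of about 0.73 nm,
# comparable to atomic spacings (hence electron diffraction in crystals).
lam = de_broglie_wavelength(M_ELECTRON, 1e6)
print(f"{lam:.3e} m")  # 7.274e-10 m
```

When this wavelength becomes comparable to the inter-particle spacing, single-particle intuition fails and the collective "quantum liquid" behavior described above takes over.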

**Category:** Condensed Matter

[124] **viXra:1705.0144 [pdf]**
*submitted on 2017-05-09 04:11:47*

**Authors:** Philip Gibbs

**Comments:** 9 Pages.

Humanity faces many dangers from climate change and wars to asteroid impacts that could harm our future. Often logical reasoning does not seem to play a strong part in discussions on such subjects and even peer-review is flawed. I contend that the solution is a better system of open peer-review.

**Category:** General Science and Philosophy

[123] **viXra:1705.0143 [pdf]**
*submitted on 2017-05-09 06:25:08*

**Authors:** Dhananjay P. Mehendale

**Comments:** 11 pages

An important application of Grover's search algorithm [2] in experimental physics is its use in the synthesis of any selected superposition state [3]. This paper shows the utility of factorising, using [1], the quantum state to be synthesised. We first factorise the given quantum state when it is factorable. We then use these factors to construct the corresponding operators for synthesising those factors, and we build an operator called the synthesizer by taking the tensor product of these operators. We then apply the synthesizer to a suitable register whose qubits have all been initialised to |0>. This register is itself a tensor product of registers of suitable lengths, and the first qubit of each of these registers is an ancilla qubit initialised to |0>. We show that our modified algorithm speeds up the synthesis of the desired quantum state when the state is factorable and has at least two factors: the greater the number of factors of the quantum state, the easier it is to synthesise. In fact, the task of synthesising an n-qubit quantum state that is completely factorable into n single-qubit factors is exponentially easier than that of synthesising an n-qubit completely entangled quantum state having no factors.
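The cost argument behind this abstract (n single-qubit preparations combined by tensor products, versus one preparation in the full 2^n-dimensional register) can be illustrated with a minimal pure-Python sketch. This is a hypothetical illustration only; no notation or code from [1]-[3] is assumed:

```python
import math

def kron(u, v):
    """Tensor (Kronecker) product of two state vectors given as flat lists."""
    return [a * b for a in u for b in v]

# Two single-qubit factors of a factorable 2-qubit state:
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # (|0> + |1>)/sqrt(2)
zero = [1.0, 0.0]                            # |0>

# Synthesise each factor separately, then combine via the tensor product.
state = kron(plus, zero)

# The same state written directly in the full 2^2-dimensional register:
direct = [1 / math.sqrt(2), 0.0, 1 / math.sqrt(2), 0.0]

assert all(abs(a - b) < 1e-12 for a, b in zip(state, direct))
```

For a completely factorable n-qubit state the work scales with n single-qubit preparations, while a fully entangled state offers no such decomposition, which is the exponential separation the abstract describes.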

**Category:** Quantum Physics

[122] **viXra:1705.0142 [pdf]**
*replaced on 2017-05-13 03:50:38*

**Authors:** Carlos Castro

**Comments:** 13 Pages. Submitted to Mod. Phys. Letts A

An approach to solving the Riemann Hypothesis is revisited within the framework of the special properties of $\Theta$ (theta) functions, and the notion of $ {\cal C } { \cal T} $ invariance. The conjugation operation $ {\cal C }$ amounts to complex scaling transformations, and the $ {\cal T } $ operation
$ t \rightarrow ( 1/ t ) $ amounts to the reversal $ \log(t) \rightarrow - \log(t) $. A judicious scaling-like operator is constructed whose spectrum $E_s = s ( 1 - s ) $ is real-valued, leading to $ s = {1\over 2} + i \rho$,
and/or $ s $ = real. These values are the locations of the non-trivial and trivial zeta zeros, respectively.
A thorough analysis of the one-to-one correspondence among the zeta zeros, and the orthogonality conditions among pairs of eigenfunctions, reveals that $no$ zeros exist off the critical line. The role of the $ {\cal C }, {\cal T } $ transformations, and the properties of the Mellin transform of $ \Theta$ functions were essential in our construction.

**Category:** Number Theory

[121] **viXra:1705.0141 [pdf]**
*submitted on 2017-05-09 06:35:32*

**Authors:** George Rajna

**Comments:** 26 Pages.

The Tohoku University research group of Professor Keiichi Edamatsu and Postdoctoral fellow Naofumi Abe has demonstrated dynamically and statically unpolarized single-photon generation using diamond. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. 
For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[120] **viXra:1705.0140 [pdf]**
*submitted on 2017-05-09 06:50:09*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page. 1 illustrative graph

HAT-P-6 the host, and HAT-P-6b the companion, are placed on a graph according to stellar metamorphosis to determine their ages and stage of stellar evolution.

**Category:** Astrophysics

[119] **viXra:1705.0139 [pdf]**
*submitted on 2017-05-09 07:01:48*

**Authors:** George Rajna

**Comments:** 27 Pages.

The global race towards a functioning quantum computer is on. With future quantum computers, we will be able to solve previously impossible problems and develop, for example, complex medicines, fertilizers, or artificial intelligence. [17] The Tohoku University research group of Professor Keiichi Edamatsu and Postdoctoral fellow Naofumi Abe has demonstrated dynamically and statically unpolarized single-photon generation using diamond. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. 
The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[118] **viXra:1705.0138 [pdf]**
*replaced on 2017-05-11 09:43:59*

**Authors:** Bertrand Wong

**Comments:** 3 Pages.

It is felt by many that faster-than-light particles (tachyons) exist, though none has been detected so far. Is it really possible to detect these particles? Some possible detection methods are put forward.

**Category:** Relativity and Cosmology

[117] **viXra:1705.0137 [pdf]**
*submitted on 2017-05-08 18:10:41*

**Authors:** H. J. Spencer

**Comments:** 82 Pages. A milestone paper in the author's research programme.

This paper re-opens the debate on the failure of quantum mechanics to provide an understandable view of micro-reality. A critique is offered of the commonly accepted ‘Copenhagen Interpretation’ of a theory that is only a mathematical approach to the level of reality characterized by atoms and electrons. This critique is based on the oldest approach to thinking about nature for over 2500 years, known as Natural Philosophy.
Quantum mechanics (QM) was developed over the first quarter of the 20th Century, when scientists were enthralled by a new philosophy known as Positivism, whose foundations were based on the assumption that material objects exist only when measured by humans – this central assumption conflates epistemology (knowledge) with ontology (existence). The present critique rejects this human-centered view of reality by assuming material reality has existed long before (and will persist long after) human beings (“Realism”). The defensive view that the micro-world is too different to understand using regular thinking (and only a mathematical approach is possible) is rejected totally.
At least 12 earlier QM interpretations are critically analyzed, indicating the broad interest in “what does QM mean?”
The standard theory of quantum mechanics is thus constructed only on how the micro-world appears to macro measurements; as such, it cannot offer any view of how the foundations of the world act when humans are not observing them. This has generated almost 100 years of confusion and contradiction at the very heart of physics. Significantly, we live in a world that is not being measured by scientists but is interacting with itself and with us.
QM has failed to provide explanations: only recipes (meaningless equations), not insights. Physics has returned to the pre-Newtonian world of Ptolemaic phenomenology: only verifiable numbers without real understanding.
The focus needs to be on an explicit linkage between the micro-world, when left to itself, and our mental models of this sphere of material reality, via the mechanism of measurement. This limits the role of measurement to confirming our mental models of reality, which must never be confused with a direct image of ‘the thing in itself’. This implies a deep divide between reality and appearances, as Kant suggested.
This paper includes an original analysis of several major assumptions that have been implicit in Classical Mechanics (CM) that were acceptable in the macroscopic domain of reality, demonstrated by its proven successes. Unfortunately, only a few of these assumptions were challenged by the developers of QM. We now show that these other assumptions are still generating confusions in the interpretation of QM and blocking further progress in the understanding of the microscopic domain. Several of these flawed assumptions were introduced by Newton to support the use of continuum mathematics as a model of nature. This paper proposes that it is the attempt to preserve continuum mathematics (especially calculus), which drives much of the mystery and confusion behind all attempts at understanding quantum mechanics. The introduction of discrete mathematics is proposed to help analyze the discrete interactions between the quintessential quantum objects: the electrons and their novel properties.
A related paper demonstrates that it is possible to create a point-particle theory of electrons that explains all their peculiar (and ‘paradoxical’) behavior using only physical hypotheses and discrete mathematics without introducing the continuum mathematical ideas of fields or waves. Another (related) paper proves that all the known results for the hydrogen atom can also be exactly calculated from this new perspective with the discrete mathematics.
* Surrey, B.C. Canada (604) 542-2299 spsi99@telus.net
Version 2.015 08-05-2017 Begun 23-06-2008 {pp. 82, 70.2 Kw; 800 KB}

**Category:** Quantum Physics

[116] **viXra:1705.0136 [pdf]**
*submitted on 2017-05-09 00:59:01*

**Authors:** Zheng-chen Liang

**Comments:** 12 Pages. This paper has been published by chinaXiv:201608.00018, but somehow deleted by chinaXiv on May 9, 2017.

We derive the Lie-dependent masses of certain particles gauged as TeVeS in the considered Lie groups, raised from gauge couplings with constant global sections of singlet Higgs, under the algorithm on mass terms which comes out naturally from the kinetic part of our considered TaLie action and is also available on the gauge fields as connections in the formed Y-M actions. With the only parameters, the \textit{scaled mass} $M(H^{D})\in\mathbb{R^{+}}$ of each Higgs section introduced in this mechanism, we concretely compute the masses $m_{W^{\pm}}$, $m_{Z^{0}}$, $m_{X}$ and $m_{H}$ under the gauge selection $E_{8(-24)}$ in \textit{Lie Group Cosmology} (LGC), showing that the masses of all the different singlet Higgs bosons equal the single real number $\sqrt{2}\cdot M(H^{\Sigma})$. Comparing the results with recent experiments at the LHC, we find that the singlet Higgs spontaneity, with algorithms derived from our considered action under the gauge selection of LGC, is consistent with current data including the diphoton excess at $750$ GeV, and we state some important implications of the derived Lie-dependent masses and of our construction of the mechanism.

**Category:** High Energy Particle Physics

[115] **viXra:1705.0135 [pdf]**
*replaced on 2017-11-05 08:02:40*

**Authors:** A. I. Andreus

**Comments:** 4 Pages. In Russian and English of Google

False science spawned the Big Bang Theory.
It is an exquisite pseudoscience, since not all the i's have yet been dotted, and scientists do not dethrone such benchmarks in a person's world view.
The Big Bang Theory is pseudoscientific.
Propaganda of the Big Bang Theory illustrates the loss of honor by scientists around the globe, for more than a century, in exchange for the blessings and the content of their lives from society on planet Earth ...

**Category:** Relativity and Cosmology

[114] **viXra:1705.0134 [pdf]**
*submitted on 2017-05-08 07:46:32*

**Authors:** George Rajna

**Comments:** 27 Pages.

While technologies that currently run on classical computers, such as Watson, can help find patterns and insights buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn't exist and the possibilities that you need to explore to get to the answer are too enormous to ever be processed by classical computers. [16] Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15] Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. 
The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[113] **viXra:1705.0133 [pdf]**
*submitted on 2017-05-08 07:55:21*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Similarity Measure considered to Exhaustion [1].

**Category:** Mathematical Physics

[112] **viXra:1705.0132 [pdf]**
*submitted on 2017-05-08 07:59:29*

**Authors:** George Rajna

**Comments:** 17 Pages.

Quantum entanglement, one of the most intriguing features of multi-particle quantum systems, has become a fundamental building block in both quantum information processing and quantum computation. [10]
The microscopic world is governed by the rules of quantum mechanics, where the properties of a particle can be completely undetermined and yet strongly correlated with those of other particles. Physicists from the University of Basel have observed these so-called Bell correlations for the first time between hundreds of atoms. [9]
For the past 100 years, physicists have been studying the weird features of quantum physics, and now they're trying to put these features to good use. One prominent example is that quantum superposition (also known as quantum coherence)—which is the property that allows an object to be in two states at the same time—has been identified as a useful resource for quantum communication technologies. [8]
Quantum entanglement—which occurs when two or more particles are correlated in such a way that they can influence each other even across large distances—is not an all-or-nothing phenomenon, but occurs in various degrees. The more a quantum state is entangled with its partner, the better the states will perform in quantum information applications. Unfortunately, quantifying entanglement is a difficult process involving complex optimization problems that give even physicists headaches. [7]
A trio of physicists in Europe has come up with an idea that they believe would allow a person to actually witness entanglement. Valentina Caprara Vivoli, with the University of Geneva, Pavel Sekatski, with the University of Innsbruck and Nicolas Sangouard, with the University of Basel, have together written a paper describing a scenario where a human subject would be able to witness an instance of entanglement—they have uploaded it to the arXiv server for review by others. [6]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the relativistic quantum theory.

**Category:** Quantum Physics

[111] **viXra:1705.0131 [pdf]**
*submitted on 2017-05-08 08:59:21*

**Authors:** George Rajna

**Comments:** 21 Pages.

Jie Ma, a professor from Shanghai Jiao Tong University in China, is using neutrons at Oak Ridge National Laboratory's High Flux Isotope Reactor to discover a three-dimensional image of the magnetic lattice of an oxide material (Ba2CoTeO6) containing quantum properties that could provide new insight into how electron "spins" can improve data processing and storage in computers. [13]
An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons (thought to be indivisible building blocks of nature) to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were a "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the relativistic quantum theory.
The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron/proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[110] **viXra:1705.0130 [pdf]**
*submitted on 2017-05-07 08:49:48*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that there exist infinitely many Poulet numbers P such that concatenating P to the left with the number (s(P) – 1)/2, where s(P) is the digit sum of P, yields a prime; I make the same conjecture for (s(P) – 1)/3 and for (s(P) – 1)/6.
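The construction can be checked numerically in a few lines of Python. This is a hypothetical sketch (not code from the paper); `is_prime` is plain trial division, and a Poulet number is taken in the standard sense of a composite base-2 Fermat pseudoprime:

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test (adequate for small candidates)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def is_poulet(n: int) -> bool:
    """Poulet number: composite n with 2^(n-1) congruent to 1 (mod n)."""
    return n > 2 and not is_prime(n) and pow(2, n - 1, n) == 1

def conjecture_examples(limit: int, divisor: int = 2):
    """Poulet numbers P < limit for which prepending (s(P)-1)/divisor
    to P (as a decimal string) gives a prime."""
    hits = []
    # The smallest even base-2 pseudoprime is 161038, so scanning odd
    # numbers only is safe for small limits.
    for p in range(3, limit, 2):
        if not is_poulet(p):
            continue
        s = sum(int(c) for c in str(p))
        if (s - 1) % divisor:
            continue
        candidate = int(str((s - 1) // divisor) + str(p))
        if is_prime(candidate):
            hits.append((p, candidate))
    return hits

print(conjecture_examples(2000))
```

For example, P = 1387 is a Poulet number with digit sum s = 19, so the candidate tested is the concatenation of (19 – 1)/2 = 9 with 1387, i.e. 91387.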

**Category:** Number Theory

[109] **viXra:1705.0129 [pdf]**
*submitted on 2017-05-07 08:51:48*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In a previous paper, “Primes obtained concatenating a Poulet number P with (s - 1)/n where s digits sum of P and n is 2, 3 or 6”, I noticed that in almost all of the cases I considered, if a prime was obtained through this concatenation then the digit sum of P was a prime. That gave me the idea for this paper, in which I observe that for many primes p having an odd prime digit sum s there exists a prime obtained by concatenating p to the left with a divisor of s – 1 (including 1 and s – 1).

**Category:** Number Theory

[108] **viXra:1705.0128 [pdf]**
*submitted on 2017-05-07 09:59:15*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Similarity Measure and its series considered to Exhaustion [1].

**Category:** Statistics

[107] **viXra:1705.0127 [pdf]**
*submitted on 2017-05-07 11:13:19*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation based on the Ananda-Damayanthi Similarity Measure considered to Exhaustion [1].

**Category:** Statistics

[106] **viXra:1705.0126 [pdf]**
*submitted on 2017-05-07 11:22:44*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In a previous paper, “Primes obtained concatenating to the left a prime having an odd prime digit sum s with a divisor of s - 1”, I observed that for many primes p having an odd prime digit sum s there exists a prime obtained by concatenating p to the left with a divisor of s – 1. In this paper I conjecture that for any prime p, p ≠ 5, having an odd prime digit sum s there exist infinitely many primes obtained by concatenating p to the left with multiples of s – 1. I further conjecture that there exists at least one prime obtained by concatenating n*(s – 1) with p such that n < sqr s.

**Category:** Number Theory

[105] **viXra:1705.0125 [pdf]**
*submitted on 2017-05-07 12:05:19*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 2 Pages. 1 screenshot, 1 illustrative graph

It is proposed that we can travel through time by using good theory, though not the kind of time travel represented in Hollywood; to do this we must use our common sense. Since the universe recycles itself constantly and is eternal, we can find objects and time periods that are very similar to our own past and serve as good analogs of our future.

**Category:** Relativity and Cosmology

[104] **viXra:1705.0124 [pdf]**
*submitted on 2017-05-07 16:49:38*

**Authors:** Rodolfo A. Frino

**Comments:** 5 Pages.

In this paper I derive the expression for the Planck force from the Heisenberg uncertainty relations.
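For context, one standard heuristic route from the uncertainty relation to the Planck force (which may well differ from the derivation given in the paper) runs as follows:

$$\Delta x \, \Delta p \gtrsim \hbar \quad\Rightarrow\quad \Delta p \sim \frac{\hbar}{\Delta x}, \qquad F \sim \frac{c \, \Delta p}{\Delta x} = \frac{\hbar c}{(\Delta x)^{2}}.$$

Evaluating at the Planck length $\Delta x = l_P = \sqrt{\hbar G / c^{3}}$ gives

$$F_P = \frac{\hbar c}{l_P^{2}} = \frac{c^{4}}{G} \approx 1.2 \times 10^{44}\ \mathrm{N}.$$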

**Category:** Quantum Physics

[103] **viXra:1705.0123 [pdf]**
*submitted on 2017-05-07 18:43:23*

**Authors:** Osvaldo F. Schilling

**Comments:** 8 Pages. 1 table and 2 figures

In previous papers the author has analyzed data for leptons and baryons which converge on the association of magnetic energy with the rest energies of these particles. In this paper a crucial parameter in this model, the number n of flux quanta trapped inside the region covered by an intrinsic motion of a particle, is considered in detail. Strictly fitting theory to experiment for baryons results in fractional n which lie close to, but deviate from, the numbers expected from a classical calculation. We show that the data display a tendency to form Shapiro-like steps at integer numbers of flux quanta, which seems at least in part responsible for the observed deviations from the classical prediction.

**Category:** Quantum Physics

[102] **viXra:1705.0122 [pdf]**
*submitted on 2017-05-06 14:40:55*

**Authors:** Johan Noldus

**Comments:** 5 Pages.

We study the geometry of simplicial complexes from an algebraic point of view and devise general quantization rules; the rules emerging in spin foam theory are shown to comprise a particular subcase.

**Category:** Quantum Gravity and String Theory

[101] **viXra:1705.0119 [pdf]**
*replaced on 2017-05-12 11:19:53*

**Authors:** Jaidev B. Parmar

**Comments:** 8 Pages.

A model of the Universe is constructed, and a number of problems in contemporary physics, such as baryon asymmetry, dark matter, proton decay, galaxy rotation curves, quasars, SMBHs, relativistic astrophysical jets, coronal heating, the solar cycle, the supernova mechanism, magnetar magnetic fields, the cosmological lithium problem, the solar neutrino problem, the existence of black holes and electron spin, are discussed in the light of the hypothetical Universe. It is proposed that there exist stable neutrons and antineutrons that could explain the dark matter and the missing antimatter of the Universe.

**Category:** Astrophysics

[100] **viXra:1705.0118 [pdf]**
*submitted on 2017-05-06 11:30:37*

**Authors:** Uwe Kayser-Herold

**Comments:** 2 Pages.

A transformation of the conditional equation for the magnetic flux quantum $\vec{\Phi}_0 = \frac{2\pi}{e}\,\vec{\hbar}/2$ yields the conditional equation for the quantum of electromagnetic canonical angular momentum: $\frac{e}{2\pi}\,\vec{\Phi}_0 = \vec{\hbar}/2$.
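Numerically, the stated flux quantum is the familiar Φ₀ = h/2e, and the transformation back to ℏ/2 is an algebraic identity; a quick check with CODATA values (a sketch):

```python
from math import pi

hbar = 1.054571817e-34   # reduced Planck constant, J*s (CODATA value)
e = 1.602176634e-19      # elementary charge, C (CODATA value)

phi0 = (2 * pi / e) * (hbar / 2)   # magnetic flux quantum, equals h/(2e)
spin = (e / (2 * pi)) * phi0       # recovers hbar/2

print(phi0)   # about 2.0678e-15 Wb
```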

**Category:** Quantum Physics

[99] **viXra:1705.0117 [pdf]**
*replaced on 2017-05-31 18:39:21*

**Authors:** Jason Cole

**Comments:** 4 Pages.

There is exciting research trying to connect the nontrivial zeros of the Riemann Zeta function to quantum mechanics as a breakthrough towards proving the 160-year-old Riemann Hypothesis. This research offers a radically new approach.
Most research up to this point has focused on mapping the nontrivial zeros directly to eigenvalues. Those attempts have failed or have not yielded any new breakthrough. This research takes a radically different approach by focusing on the quantum mechanical properties of the wave graph of Zeta as ζ(0.5+it), and not on the nontrivial zeros directly. The conjecture is that the wave forms in the graph of the Riemann Zeta function ζ(0.5+it) constitute a wave function ψ, made of a complex version of the parity operator wave function. The Riemann Zeta function consists of linked even and odd parity operator wave functions on the critical line. From this new approach, it is shown that the complex version of the parity operator wave function is Hermitian and that its eigenvalues match the zeros of the Zeta function.
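The object of study, ζ(0.5 + it) along the critical line, can be evaluated without special libraries; the sketch below uses P. Borwein's alternating-series algorithm for the Dirichlet eta function (my implementation choice, not the paper's method):

```python
from fractions import Fraction
from math import factorial

def zeta_half_line(t, n=60):
    """zeta(1/2 + i*t) via Borwein's eta-function algorithm.
    The d_k coefficients are kept as exact rationals to avoid
    cancellation; n = 60 is ample for moderate t (say |t| < 50)."""
    s = 0.5 + 1j * t
    d, acc = [], Fraction(0)
    for i in range(n + 1):
        acc += Fraction(n * factorial(n + i - 1) * 4**i,
                        factorial(n - i) * factorial(2 * i))
        d.append(acc)
    total = sum((-1) ** k * float(d[k] - d[n]) / (k + 1) ** s for k in range(n))
    return -total / (float(d[n]) * (1 - 2 ** (1 - s)))
```

On the critical line this reproduces ζ(1/2) ≈ −1.46035, and |ζ(0.5 + it)| dips to zero near the first nontrivial zero t ≈ 14.134725, the wave behaviour the abstract describes.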

**Category:** Number Theory

[98] **viXra:1705.0116 [pdf]**
*submitted on 2017-05-05 13:13:50*

**Authors:** George Rajna

**Comments:** 30 Pages.

Now MIT physicists have found that a flake of graphene, when brought in close proximity with two superconducting materials, can inherit some of those materials' superconducting qualities. As graphene is sandwiched between superconductors, its electronic state changes dramatically, even at its center. [18]
EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17]
Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16]
When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15]
Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]
Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13]
An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons, thought to be indivisible building blocks of nature, to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level, it is the dark energy and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[97] **viXra:1705.0115 [pdf]**
*submitted on 2017-05-05 16:53:48*

**Authors:** Christopher Goddard

**Comments:** 23 Pages.

Using an extension of the idea of the radical of a number, as well as a few other ideas, it is indicated why one might expect the Oesterle-Masser conjecture to be true. Based on structural elements arising from this proof, a criterion is then developed and shown to be potentially sufficient to resolve two relatively deep conjectures about the structure of the prime numbers. A sketch is consequently provided as to how it might be possible to demonstrate this criterion, borrowing ideas from information theory and cybernetics.
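For readers unfamiliar with the central object: the radical rad(n) is the product of the distinct primes dividing n, and the Oesterle-Masser (abc) conjecture bounds c in terms of powers of rad(abc) for coprime a + b = c. A minimal illustration (a sketch, not the paper's construction):

```python
from math import log

def radical(n):
    """Product of the distinct prime factors of n (rad(1) = 1)."""
    r, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            r *= p
            while n % p == 0:
                n //= p
        p += 1
    return r * n if n > 1 else r

a, b = 1, 8                            # coprime, with a + b = c
c = a + b
q = log(c) / log(radical(a * b * c))   # "quality" of the abc triple
print(radical(a * b * c), round(q, 4))
```

Triples with quality q > 1, like (1, 8, 9) with rad(72) = 6, are the rare abc "hits"; the conjecture asserts that q cannot exceed 1 by much, and only finitely often for each bound.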

**Category:** Number Theory

[96] **viXra:1705.0114 [pdf]**
*submitted on 2017-05-05 17:23:48*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 2 Pages. 1 diagram, 1 weblink

A strange phenomenon of language has appeared concerning the naming of host stars of new systems. For some reason scientists have abandoned the red dwarf/brown dwarf distinction in favor of “ultra-cool dwarf star”. The public should be made aware that this signals a crack in the foundation of accepted astrophysical interpretations and theories. An explanation is provided.

**Category:** Astrophysics

[95] **viXra:1705.0113 [pdf]**
*submitted on 2017-05-05 19:39:37*

**Authors:** Robert Stach

**Comments:** 31 Pages.

Part 1: An extension of the Archimedean principle and its technical applications. Part 2: The swastika and its technical significance.

**Category:** Classical Physics

[94] **viXra:1705.0112 [pdf]**
*replaced on 2018-01-07 21:30:14*

**Authors:** Morio Kikuchi

**Comments:** 60 Pages.

We modify the present rules of soccer so that points of several kinds can be scored, following the point system of rugby.

**Category:** General Science and Philosophy

[93] **viXra:1705.0111 [pdf]**
*submitted on 2017-05-05 11:00:03*

**Authors:** George Rajna

**Comments:** 27 Pages.

Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20]
Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19]
Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18]
A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17]
Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]
Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

**Category:** Quantum Physics

[92] **viXra:1705.0110 [pdf]**
*submitted on 2017-05-05 11:36:44*

**Authors:** George Rajna

**Comments:** 28 Pages.

City College of New York researchers have now demonstrated a new class of artificial media called photonic hypercrystals that can control light-matter interaction in unprecedented ways. [21]
Experiments at the Institute of Physical Chemistry of the Polish Academy of Sciences in Warsaw prove that chemistry is also a suitable basis for storing information. The chemical bit, or 'chit,' is a simple arrangement of three droplets in contact with each other, in which oscillatory reactions occur. [20]
Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19]
Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18]
A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17]
Scientists at the University of Sussex have invented a ground-breaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]
Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[91] **viXra:1705.0109 [pdf]**
*submitted on 2017-05-05 09:04:13*

**Authors:** George Rajna

**Comments:** 28 Pages.

In the non-intuitive quantum domain, the phenomenon of counterfactuality is defined as the transfer of a quantum state from one site to another without any quantum or classical particle transmitted between them. [17]
The quantum internet, which connects particles linked together by the principle of quantum entanglement, is like the early days of the classical internet – no one can yet imagine what uses it could have, according to Professor Ronald Hanson, from Delft University of Technology, the Netherlands, whose team was the first to prove that the phenomenon behind it was real. [16]
Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15]
Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[90] **viXra:1705.0108 [pdf]**
*submitted on 2017-05-05 09:20:09*

**Authors:** Dimiter Dobrev

**Comments:** 17 Pages.

How do we describe the invisible? Let’s take a sequence: input, output, input, output ... Behind this sequence stands a world and the sequence of its internal states. We do not see the internal state of the world, but only a part of it. To describe that part which is invisible, we will use the concept of ‘incorrect move’ and its generalization ‘testable state’. Thus, we will reduce the problem of partial observability to the problem of full observability.

**Category:** Artificial Intelligence

[89] **viXra:1705.0107 [pdf]**
*submitted on 2017-05-05 02:32:22*

**Authors:** George Rajna

**Comments:** 39 Pages.

Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. [22]
A research team led by Professor CheolGi Kim has developed a biosensor platform using magnetic patterns resembling a spider web with detection capability 20 times faster than existing biosensors. [21]
Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. [20]
Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible – they have developed a new fluorescence microscope, called MINFLUX, allowing, for the first time, to optically separate molecules, which are only nanometers (one millionth of a millimeter) apart from each other. [19]
Dipole orientation provides a new dimension in super-resolution microscopy. [18]
Fluorescence is an incredibly useful tool for experimental biology and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17]
Molecules that change colour can be used to follow in real-time how bacteria form a protective biofilm around themselves. This new method, which has been developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and the food industry, where bacterial biofilms are a problem. [16]
Researchers led by Carnegie Mellon University physicist Markus Deserno and University of Konstanz (Germany) chemist Christine Peter have developed a computer simulation that crushes viral capsids. By allowing researchers to see how the tough shells break apart, the simulation provides a computational window for looking at how viruses and proteins assemble. [15]
IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14]

**Category:** Physics of Biology

[88] **viXra:1705.0106 [pdf]**
*submitted on 2017-05-05 03:43:32*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

In this research investigation, the author has presented a Recursive Future Equation based on the Ananda-Damayanthi Normalized Similarity Measure [1].

**Category:** Statistics

[87] **viXra:1705.0105 [pdf]**
*submitted on 2017-05-04 11:52:29*

**Authors:** George Rajna

**Comments:** 27 Pages.

The quantum internet, which connects particles linked together by the principle of quantum entanglement, is like the early days of the classical internet – no one can yet imagine what uses it could have, according to Professor Ronald Hanson, from Delft University of Technology, the Netherlands, whose team was the first to prove that the phenomenon behind it was real. [16]
Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fiber optic cable infrastructure. [15]
Optical quantum technologies are based on the interactions of atoms and photons at the single-particle level, and so require sources of single photons that are highly indistinguishable – that is, as identical as possible. Current single-photon sources using semiconductor quantum dots inserted into photonic structures produce photons that are ultrabright but have limited indistinguishability due to charge noise, which results in a fluctuating electric field. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[86] **viXra:1705.0104 [pdf]**
*submitted on 2017-05-04 12:18:15*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 4 Pages.

In this research investigation, the author has presented a Recursive Past Equation and a Recursive Future Equation, both based on the Ananda-Damayanthi Similarity Measure [1].

**Category:** Mathematical Physics

[85] **viXra:1705.0103 [pdf]**
*submitted on 2017-05-04 08:01:49*

**Authors:** Dmitri Martila

**Comments:** 7 Pages.

With all the diversity of theoretical physics, there is no problem in reconciling Nature with reality.

**Category:** Quantum Physics

[84] **viXra:1705.0102 [pdf]**
*submitted on 2017-05-04 10:00:19*

**Authors:** George Rajna

**Comments:** 23 Pages.

Weizmann Institute of Science researchers recently uncovered thousands of human genes that are expressed—copied out to make proteins—differently in the two sexes. [13]
Leiden theoretical physicists have proven that DNA mechanics, in addition to genetic information in DNA, determines who we are. Helmut Schiessel and his group simulated many DNA sequences and found a correlation between mechanical cues and the way DNA is folded. They have published their results in PLoS One. [12]
We model the electron clouds of nucleic acids in DNA as a chain of coupled quantum harmonic oscillators with dipole-dipole interaction between nearest neighbours, resulting in a van der Waals type bonding. [11]
Scientists have discovered a secret second code hiding within DNA which instructs cells on how genes are controlled. The amazing discovery is expected to open new doors to the diagnosis and treatment of diseases, according to a new study. [10]
There is also a connection between statistical physics and evolutionary biology, since the arrow of time is at work in biological evolution as well. From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: the former tend to be much better at capturing energy from their environment and dissipating that energy as heat. [8]
This paper contains the review of quantum entanglement investigations in living systems, and in the quantum mechanically modeled photoactive prebiotic kernel systems. [7]
The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to understand the Quantum Biology.

**Category:** Physics of Biology

[83] **viXra:1705.0101 [pdf]**
*replaced on 2017-08-14 09:56:23*

**Authors:** Jeff Yee

**Comments:** 30 pages

Three commonly used physics equations for energy are derived from a single equation that describes wave energy, linking the photon’s quantum energy (E=hf) with mass-energy (E=mc^2) and energy-momentum (E=pc) found in particles. Then, the energy equation for particles is further derived in this paper to describe the Coulomb force (F=kqq/r^2) and the universal gravitational force (F=Gmm/r^2). All of these equations are ultimately derived from one fundamental energy wave equation.
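As an editorial sanity check (not part of the paper's derivation), the linked relations can be verified numerically: E = hf and E = pc coincide for a photon, and E = mc^2 gives the familiar electron rest energy.

```python
# Numerical consistency check of the three standard energy relations.
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

# Photon: quantum energy E = h*f agrees with energy-momentum E = p*c,
# since a photon's momentum is p = h*f / c.
f = 5.0e14                 # frequency of a visible-light photon, Hz
E_quantum = h * f
p = h * f / c              # photon momentum
E_momentum = p * c
assert abs(E_quantum - E_momentum) <= 1e-12 * E_quantum

# Massive particle at rest: E = m*c^2 (electron rest energy ~ 0.511 MeV).
m_e = 9.1093837015e-31     # electron mass, kg
E_rest = m_e * c ** 2
E_rest_MeV = E_rest / 1.602176634e-19 / 1e6   # ~ 0.512 MeV
```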

**Category:** High Energy Particle Physics

[82] **viXra:1705.0100 [pdf]**
*submitted on 2017-05-03 13:05:47*

**Authors:** Edgar Valdebenito

**Comments:** 3 Pages.

This note presents some double integrals for Euler-Mascheroni constant and related fractals.
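The note's particular integrals are not reproduced in the abstract; for orientation, one well-known double-integral representation of the Euler-Mascheroni constant (due to Sondow) can be evaluated numerically. The midpoint grid below is an illustrative choice that sidesteps the boundary singularities.

```python
import math

# gamma = int_0^1 int_0^1 (x - 1) / ((1 - x*y) * ln(x*y)) dx dy  (Sondow).
# A midpoint rule is used because the midpoints avoid the boundary
# singularities at x*y -> 1 and ln(x*y) -> 0.
def euler_gamma_estimate(n=400):
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            total += (x - 1.0) / ((1.0 - x * y) * math.log(x * y))
    return total * h * h

est = euler_gamma_estimate()   # close to 0.5772...
```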

**Category:** Number Theory

[81] **viXra:1705.0099 [pdf]**
*submitted on 2017-05-03 13:26:13*

**Authors:** George Rajna

**Comments:** 40 Pages.

A mysterious gamma-ray glow at the center of the Milky Way is most likely caused by pulsars – the incredibly dense, rapidly spinning cores of collapsed ancient stars that were up to 30 times more massive than the sun. [28] Further evidence of the existence of dark matter – the mysterious substance that is believed to hold the Universe together – has been produced by Cosmologists at Durham University. [27] Researchers at the University of Waterloo have been able to capture the first composite image of a dark matter bridge that connects galaxies together. [26] In an abandoned gold mine one mile beneath Lead, South Dakota, the cosmos quiets down enough to potentially hear the faint whispers of the universe's most elusive material—dark matter. [25] The PICO bubble chambers use temperature and sound to tune into dark matter particles. [24] A detection device designed and built at Yale is narrowing the search for dark matter in the form of axions, a theorized subatomic particle that may make up as much as 80% of the matter in the universe. [23] The race is on to build the most sensitive U.S.-based experiment designed to directly detect dark matter particles. Department of Energy officials have formally approved a key construction milestone that will propel the project toward its April 2020 goal for completion. [22] Scientists at the Center for Axion and Precision Physics Research (CAPP), within the Institute for Basic Science (IBS) have optimized some of the characteristics of a magnet to hunt for one possible component of dark matter called axion. [21] The first sighting of clustered dwarf galaxies bolsters a leading theory about how big galaxies such as our Milky Way are formed, and how dark matter binds them, researchers said Monday. 
[20] Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK's GridPP collaboration to tackle one of the Universe's biggest mysteries – the nature of dark matter and dark energy. [18]

**Category:** Astrophysics

[80] **viXra:1705.0098 [pdf]**
*submitted on 2017-05-03 13:47:48*

**Authors:** G. Healey, S. Zhao, D. Brooks

**Comments:** 6 Pages.

Given the speed and movement for pitches thrown by a set of pitchers, we develop a measure of pitcher similarity.
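The abstract does not specify the measure; as a purely illustrative stand-in, each pitcher could be summarized by average pitch features (the speed/movement triple below is hypothetical) and compared by Euclidean distance.

```python
import math

# Hypothetical sketch, not the paper's measure: each pitch is a tuple
# (speed, horizontal break, vertical break); a pitcher is summarized by
# the mean feature vector, and similarity is the negative Euclidean
# distance between summaries (larger = more similar).
def summarize(pitches):
    n = len(pitches)
    return [sum(p[k] for p in pitches) / n for k in range(3)]

def similarity(pitcher_a, pitcher_b):
    sa, sb = summarize(pitcher_a), summarize(pitcher_b)
    return -math.sqrt(sum((x - y) ** 2 for x, y in zip(sa, sb)))

a = [(94.0, 5.1, 8.2), (93.5, 4.9, 8.0)]   # two similar power pitchers
b = [(94.2, 5.0, 8.1), (93.4, 5.2, 8.3)]
c = [(84.0, 12.0, 2.0)]                    # a very different profile
```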

**Category:** Statistics

[79] **viXra:1705.0097 [pdf]**
*submitted on 2017-05-03 13:55:04*

**Authors:** G. Healey, S. Zhao, D. Brooks

**Comments:** 7 Pages.

Tables of the most similar pitcher matches for 2016.

**Category:** Statistics

[78] **viXra:1705.0096 [pdf]**
*submitted on 2017-05-03 20:30:54*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page.

This paper outlines the discoveries (or lack thereof, due to improper theory development based on false assumptions) that led to astrophysical theories and mistakes. Major expansions to this timeline are expected, but this does not prevent an initial accounting of the mistakes.

**Category:** History and Philosophy of Physics

[77] **viXra:1705.0095 [pdf]**
*submitted on 2017-05-03 22:01:08*

**Authors:** John Peel

**Comments:** 6 Pages. The sets at the end are important

Calculating certain aspects of geometry has been difficult; they have defied analysis. Here I propose a method of analysing shape and space in terms of two variables (n, m).

**Category:** Mathematical Physics

[76] **viXra:1705.0094 [pdf]**
*submitted on 2017-05-04 04:17:51*

**Authors:** Shiyuan Li

**Comments:** 7 Pages.

Rotation invariance and translation invariance have great value in image recognition. In this paper, we introduce a new convolutional neural network (CNN) architecture that achieves rotation invariance and translation invariance in 2-D symbol recognition. The network also yields the position and orientation of the 2-D symbol, enabling detection of multiple non-overlapping targets. Humans can look at an object once and remember it; this architecture can likewise be used for such one-shot learning.
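As a toy illustration (not the paper's architecture), one standard route to rotation invariance is to evaluate the same filter on rotated copies of the input and max-pool the responses, so the pooled response is unchanged under 90-degree rotations of the input.

```python
def rot90(grid):
    # Rotate a square grid 90 degrees clockwise.
    return [list(row) for row in zip(*grid[::-1])]

def filter_response(grid, weights):
    # Dot product of the grid with a same-sized filter.
    return sum(g * w for grow, wrow in zip(grid, weights)
                     for g, w in zip(grow, wrow))

def invariant_response(grid, weights):
    # Max-pool the filter response over the four 90-degree rotations;
    # rotating the input only permutes the pooled set, so the max is
    # invariant.
    g, responses = grid, []
    for _ in range(4):                 # 0, 90, 180, 270 degrees
        responses.append(filter_response(g, weights))
        g = rot90(g)
    return max(responses)

g = [[1, 2, 0], [3, 4, 0], [0, 0, 5]]  # tiny example "symbol"
w = [[1, 0, 0], [0, 2, 0], [0, 0, 3]]  # tiny example filter
```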

**Category:** Artificial Intelligence

[75] **viXra:1705.0093 [pdf]**
*replaced on 2017-07-05 00:57:58*

**Authors:** L. Martino

**Comments:** IET Electronics Letters, Volume 53, Issue 16, Pages: 1115-1117, 2017

Monte Carlo (MC) methods have become very popular in signal processing during the past decades. The adaptive rejection sampling (ARS) algorithms are well-known MC techniques that draw independent samples efficiently from univariate target densities. The ARS schemes yield a sequence of proposal functions that converge toward the target, so that the probability of accepting a sample approaches one. However, sampling from the proposal pdf becomes more computationally demanding each time it is updated. We propose the Parsimonious Adaptive Rejection Sampling (PARS) method, which obtains an efficient trade-off between acceptance rate and proposal complexity. The resulting algorithm is faster than the standard ARS approach.
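PARS itself is not reproduced here; the accept/reject principle that all ARS-type schemes refine can be illustrated with a fixed (non-adaptive) constant envelope.

```python
import math
import random

# Plain rejection sampling with a constant envelope. ARS/PARS instead
# build adaptive piecewise proposals that hug the target; this sketch
# only shows the underlying accept/reject step.
def rejection_sample(target_pdf, bound, lo, hi, n, seed=0):
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        x = rng.uniform(lo, hi)       # propose uniformly on [lo, hi]
        u = rng.uniform(0.0, bound)   # vertical coordinate under envelope
        if u <= target_pdf(x):        # accept with prob target/envelope
            samples.append(x)
    return samples

# Target: standard normal density, truncated to [-4, 4]; 0.4 bounds its peak.
pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
xs = rejection_sample(pdf, 0.4, -4.0, 4.0, 20000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
```

The low acceptance rate of the constant envelope is exactly the inefficiency that adaptive proposals remove, at the proposal-maintenance cost PARS aims to control.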

**Category:** Statistics

[74] **viXra:1705.0092 [pdf]**
*submitted on 2017-05-04 05:49:31*

**Authors:** Clayton James Conway

**Comments:** 3 Pages.

Historic chronologies, king lists, and dendrochronologies (tree rings) agree on two major cycles, (T) 6 orbits per 181 years and (V) 32 orbits per 1001 years, during which enormous changes occurred on the Earth into the Common Era.
This proves the idea of a dead-gravity (no-charge) theory to be incorrect.
The age of the Earth is just under 100 million years.

**Category:** History and Philosophy of Physics

[73] **viXra:1705.0091 [pdf]**
*submitted on 2017-05-04 06:15:57*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

The author presents a Recursive Future Equation.

**Category:** Mathematical Physics

[72] **viXra:1705.0090 [pdf]**
*submitted on 2017-05-04 06:49:43*

**Authors:** George Rajna

**Comments:** 16 Pages.

The first experimental result has been published from the newly upgraded Continuous Electron Beam Accelerator Facility (CEBAF) at the U.S. Department of Energy's Thomas Jefferson National Accelerator Facility. The result demonstrates the feasibility of detecting a potential new form of matter to study why quarks are never found in isolation. [10] A team of scientists currently working at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) announced that it has possibly discovered the existence of a particle integral to nature in a statement on Tuesday, Dec. 15, and again on Dec. 16. [9] In 2012, a proposed observation of the Higgs boson was reported at the Large Hadron Collider in CERN. The observation has puzzled the physics community, as the mass of the observed particle, 125 GeV, looks lighter than the expected energy scale, about 1 TeV. [8] 'In the new run, because of the highest-ever energies available at the LHC, we might finally create dark matter in the laboratory,' says Daniela. 'If dark matter is the lightest SUSY particle then we might discover many other SUSY particles, since SUSY predicts that every Standard Model particle has a SUSY counterpart.' [7] The problem is that there are several things the Standard Model is unable to explain, for example the dark matter that makes up a large part of the universe. Many particle physicists are therefore working on the development of new, more comprehensive models. [6] They might seem quite different, but both the Higgs boson and dark matter particles may have some similarities. The Higgs boson is thought to be the particle that gives matter its mass. And in the same vein, dark matter is thought to account for much of the 'missing mass' in galaxies in the universe. It may be that these mass-giving particles have more in common than was thought.
[5] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate by the diffraction patterns. The accelerating charges explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Relativistic Quantum Theories. The self-maintained electric potential of the accelerating charges is equivalent to the space-time curvature of General Relativity and, since this holds on the quantum level as well, provides the basis of Quantum Gravity.

**Category:** High Energy Particle Physics

[71] **viXra:1705.0089 [pdf]**
*replaced on 2017-08-19 04:17:04*

**Authors:** Erman ZENG

**Comments:** 15 Pages.

The mathematical characterization of "the Productive Forces" of a macroeconomic system is based on an analogy between political economy and Newtonian mechanics: it is expressed as the product of the growth rate of the rate of profit (p) and the surplus value (M), and exhibits several quantum-like qualities, much like a photon. The one-dimensional linear harmonic oscillator model relates the angular frequency to the rate of change of the rate of profit, and thus to the economic growth rate, yielding a quantum-like interpretation of various business cycles. A matrix-operator analysis of Leontief's input-output table, analogous to the matrix mechanics of quantum physics, gives a Schrodinger-like value-price transformation eigenfunction, with the reduced organic composition of capital as the eigenvalue of the price wave function, namely the relations of production, thereby resolving the "two Cambridges" controversy. Combining the statistical-physics theory of entropy increase with the Marxian labor value function leads to a quantitative formulation of the relations of production.

**Category:** Economics and Finance

[70] **viXra:1705.0088 [pdf]**
*submitted on 2017-05-04 00:59:50*

**Authors:** Memet Sahin, Necati Olgun, Vakkas Uluçay, Abdullah Kargın, Florentin Smarandache

**Comments:** 18 Pages.

In this paper, we propose some transformations based on the centroid points between single valued neutrosophic numbers. We introduce these transformations according to truth, indeterminacy and falsity value of single valued neutrosophic numbers. We propose a new similarity measure based on falsity value between single valued neutrosophic sets.

**Category:** General Mathematics

[69] **viXra:1705.0086 [pdf]**
*submitted on 2017-05-04 01:02:46*

**Authors:** Surapati Pramanik, Partha Pratim Dey, Bibhas C. Giri, Florentin Smarandache

**Comments:** 10 Pages.

Bipolar neutrosophic sets are the extension of neutrosophic sets and are based on the idea of positive and negative preferences of information. Projection measure is a useful apparatus for modelling real life decision making problems. In the paper, we define projection, bidirectional projection and hybrid projection measures between bipolar neutrosophic sets.

**Category:** General Mathematics

[68] **viXra:1705.0084 [pdf]**
*submitted on 2017-05-04 01:05:13*

**Authors:** Durga Banerjee, Bibhas C. Giri, Surapati Pramanik, Florentin Smarandache

**Comments:** 10 Pages.

In this paper, the multi-attribute decision making problem based on grey relational analysis in a neutrosophic cubic set environment is investigated. In this decision making situation, the attribute weights are considered as single valued neutrosophic sets. The neutrosophic weights are converted into crisp weights. Both positive and negative GRA coefficients, as well as weighted GRA coefficients, are determined.
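For orientation, the classical crisp GRA step that the paper lifts to the neutrosophic cubic setting can be sketched as follows (rho = 0.5 is the customary distinguishing coefficient; the neutrosophic machinery itself is not reproduced).

```python
# Classical (crisp) grey relational analysis: rank alternatives by their
# grey relational grade against a reference (ideal) sequence.
def gra_grades(reference, alternatives, rho=0.5):
    # deltas[i][k] = |x0(k) - xi(k)| for alternative i, attribute k.
    deltas = [[abs(r - v) for r, v in zip(reference, alt)]
              for alt in alternatives]
    flat = [d for row in deltas for d in row]
    dmin, dmax = min(flat), max(flat)
    # Grey relational coefficient for each attribute of each alternative.
    coeffs = [[(dmin + rho * dmax) / (d + rho * dmax) for d in row]
              for row in deltas]
    # Equal-weight grade per alternative (weighted sums in general).
    return [sum(row) / len(row) for row in coeffs]

ref = [1.0, 1.0, 1.0]                       # ideal alternative
grades = gra_grades(ref, [[1.0, 1.0, 1.0],  # identical to the ideal
                          [0.8, 0.9, 0.7],
                          [0.2, 0.1, 0.3]])
```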

**Category:** General Mathematics

[67] **viXra:1705.0083 [pdf]**
*submitted on 2017-05-04 01:06:12*

**Authors:** Florentin Smarandache

**Comments:** 6 Pages.

During the process of adaptation of a being (plant, animal, or human) to a new environment or conditions, the being partially evolves, partially devolves (degenerates), and partially is indeterminate, i.e. neither evolving nor devolving and therefore unchanged (neutral), or changing in a way that is unclear, ambiguous, or vague, as in neutrosophic logic.

**Category:** General Mathematics

[66] **viXra:1705.0082 [pdf]**
*submitted on 2017-05-04 01:07:43*

**Authors:** Muhammad Aslam Malik, Ali Hassan, Said Broumi, Assia Bakali, Mohamed Talea, Florentin Smarandache

**Comments:** 24 Pages.

In this paper, we introduce the homomorphism, the weak isomorphism, the co-weak isomorphism, and the isomorphism of bipolar single valued neutrosophic hypergraphs. The properties of order, size and degree of vertices are discussed. It is also verified that isomorphism of bipolar single valued neutrosophic hypergraphs is an equivalence relation and that weak isomorphism is a partial order relation.

**Category:** General Mathematics

[65] **viXra:1705.0081 [pdf]**
*submitted on 2017-05-04 01:08:50*

**Authors:** Muhammad Aslam Malik, Ali Hassan, Said Broumi, Assia Bakali, Mohamed Talea, Florentin Smarandache

**Comments:** 26 Pages.

In this paper, we introduce the homomorphism, weak isomorphism, co-weak isomorphism and isomorphism of interval valued neutrosophic hypergraphs. The properties of order, size and degree of vertices, along with isomorphism, are included. It is also verified that isomorphism of interval valued neutrosophic hypergraphs is an equivalence relation and that weak isomorphism is a partial order relation.

**Category:** General Mathematics

[64] **viXra:1705.0080 [pdf]**
*submitted on 2017-05-04 01:10:24*

**Authors:** Muhammad Aslam Malik, Ali Hassan, Said Broumi, Assia Bakali, Mohamed Talea, Florentin Smarandache

**Comments:** 22 Pages.

In this paper, we introduce the homomorphism, weak isomorphism, co-weak isomorphism, and isomorphism of single valued neutrosophic hypergraphs. The properties of order, size and degree of vertices, along with isomorphism, are included. It is also verified that isomorphism of single valued neutrosophic hypergraphs is an equivalence relation and that weak isomorphism is a partial order relation.

**Category:** General Mathematics

[63] **viXra:1705.0079 [pdf]**
*submitted on 2017-05-04 01:11:52*

**Authors:** W.B. Vasantha Kandasamy, K. Ilanthenral, Florentin Smarandache

**Comments:** 3 Pages.

The Collatz conjecture was proposed by Lothar Collatz in 1937. To date, the conjecture remains open.
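For readers unfamiliar with the conjecture, the underlying iteration is easy to state in code; the stopping-time function below is a standard illustration, not part of the paper.

```python
# The Collatz map sends n to n/2 if n is even and to 3n + 1 if n is odd;
# the conjecture asserts that every positive integer eventually reaches 1.
def collatz_steps(n):
    # Number of iterations of the Collatz map needed to reach 1.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps
```

For example, 6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1 takes eight steps, while 27 famously takes 111.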

**Category:** General Mathematics

[62] **viXra:1705.0078 [pdf]**
*submitted on 2017-05-04 01:13:10*

**Authors:** Kalyan Mondal, Surapati Pramanik, Florentin Smarandache

**Comments:** 15 Pages.

The purpose of this study is to propose a new similarity measure, namely the rough variational coefficient similarity measure, under the rough neutrosophic environment. The weighted rough variational coefficient similarity measure is also defined.

**Category:** General Mathematics

[61] **viXra:1705.0077 [pdf]**
*submitted on 2017-05-04 01:14:19*

**Authors:** Dragisa Stanujkic, Florentin Smarandache, Edmundas Kazimieras Zavadskas, Darjan Karabasevic

**Comments:** 4 Pages.

Gathering the attitudes of the examined respondents would be very significant in some evaluation models. Therefore, a multiple criteria approach based on the use of the neutrosophic set is considered in this paper.

**Category:** General Mathematics

[60] **viXra:1705.0076 [pdf]**
*submitted on 2017-05-04 01:15:33*

**Authors:** Mai Mohamed, Mohamed Abdel-Basset, Abdel Nasser H Zaied, Florentin Smarandache

**Comments:** 5 Pages.

In this paper, we introduce integer programming in a neutrosophic environment, by considering the coefficients of the problem as triangular neutrosophic numbers. The degrees of acceptance, indeterminacy and rejection of the objectives are considered simultaneously.

**Category:** General Mathematics

[59] **viXra:1705.0072 [pdf]**
*submitted on 2017-05-04 01:21:37*

**Authors:** Ali Hassan, Muhammad Aslam Malik, Florentin Smarandache

**Comments:** 14 Pages.

In this paper, we define the regular and the totally regular interval valued neutrosophic hypergraphs, and discuss the order and size along with properties of the regular and the totally regular single valued neutrosophic hypergraphs. We extend work to completeness of interval valued neutrosophic hypergraphs.

**Category:** General Mathematics

[58] **viXra:1705.0071 [pdf]**
*submitted on 2017-05-04 01:23:01*

**Authors:** Muhammad Aslam Malik, Ali Hassan, Said Broumi, F. Smarandache

**Comments:** 6 Pages.

In this paper, we define the regular and totally regular bipolar single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular bipolar single valued neutrosophic hypergraphs. We extend work on completeness of bipolar single valued neutrosophic hypergraphs.

**Category:** General Mathematics

[57] **viXra:1705.0070 [pdf]**
*submitted on 2017-05-04 01:24:41*

**Authors:** Muhammad Aslam Malik, Ali Hassan, Said Broumi, Florentin Smarandache

**Comments:** 6 Pages.

In this paper, we define the regular and totally regular single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular single valued neutrosophic hypergraphs. We also extend work on completeness of single valued neutrosophic hypergraphs.

**Category:** General Mathematics

[56] **viXra:1705.0069 [pdf]**
*submitted on 2017-05-04 01:25:54*

**Authors:** Kalyan Mondal, Surapati Pramanik, Florentin Smarandache

**Comments:** 16 Pages.

This paper presents multi-attribute decision making based on rough neutrosophic hyper-complex sets with rough neutrosophic hyper-complex attribute values. The concept of neutrosophic hyper-complex set is a powerful mathematical tool to deal with incomplete, indeterminate and inconsistent information. We extend the concept of neutrosophic hyper-complex set to the rough neutrosophic hyper-complex environment. The ratings of all alternatives are expressed in terms of the upper/lower approximations and pairs of neutrosophic hyper-complex sets, which are characterized by two hyper-complex functions and an indeterminacy component.

**Category:** General Mathematics

[55] **viXra:1705.0068 [pdf]**
*submitted on 2017-05-04 01:27:01*

**Authors:** Kalyan Mondal, Surapati Pramanik, Florentin Smarandache

**Comments:** 13 Pages.

This paper is devoted to presenting the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method for multi-attribute group decision making under a rough neutrosophic environment. The concept of rough neutrosophic set is a powerful mathematical tool to deal with uncertainty, indeterminacy and inconsistency.
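For reference, the crisp TOPSIS procedure that the paper extends to the rough neutrosophic setting can be sketched as follows (benefit criteria only; the rough neutrosophic machinery is not reproduced).

```python
import math

# Crisp TOPSIS: alternatives (rows) are ranked by relative closeness to
# the positive ideal solution. All criteria are treated as benefit
# criteria in this simplified sketch.
def topsis(matrix, weights):
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    # Normalize each column and apply criterion weights.
    weighted = [[w * v / n for v, n, w in zip(row, norms, weights)]
                for row in matrix]
    wcols = list(zip(*weighted))
    ideal = [max(c) for c in wcols]   # positive ideal solution
    anti = [min(c) for c in wcols]    # negative ideal solution
    def dist(row, ref):
        return math.sqrt(sum((v - r) ** 2 for v, r in zip(row, ref)))
    # Relative closeness in [0, 1]; higher is better.
    return [dist(r, anti) / (dist(r, anti) + dist(r, ideal))
            for r in weighted]

closeness = topsis([[0.9, 0.8],      # dominates on both criteria
                    [0.5, 0.5],
                    [0.2, 0.1]], [0.5, 0.5])
```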

**Category:** General Mathematics

[54] **viXra:1705.0062 [pdf]**
*submitted on 2017-05-04 01:33:21*

**Authors:** Florentin Smarandache

**Comments:** 8 Pages.

In this paper, we define the subtraction and the division of neutrosophic single-valued numbers. The restrictions for these operations are presented for neutrosophic single-valued numbers and neutrosophic single-valued overnumbers / undernumbers / offnumbers. Afterwards, several numerical examples are presented.
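A sketch of how such inverse operations behave, under one commonly used convention for neutrosophic addition and multiplication (assumed here, not checked against the paper's exact definitions):

```python
# Single-valued neutrosophic number A = (t, i, f), components in [0, 1].
# Commonly used aggregation operations (assumed convention):
def nsvn_add(a, b):
    (t1, i1, f1), (t2, i2, f2) = a, b
    return (t1 + t2 - t1 * t2, i1 * i2, f1 * f2)

def nsvn_mul(a, b):
    (t1, i1, f1), (t2, i2, f2) = a, b
    return (t1 * t2, i1 + i2 - i1 * i2, f1 + f2 - f1 * f2)

# Subtraction and division as the inverses of the operations above; the
# required restrictions (e.g. t2 != 1, i2 != 0, f2 != 0 for subtraction)
# are of the kind the paper formalizes.
def nsvn_sub(a, b):
    (t1, i1, f1), (t2, i2, f2) = a, b
    return ((t1 - t2) / (1.0 - t2), i1 / i2, f1 / f2)

def nsvn_div(a, b):
    (t1, i1, f1), (t2, i2, f2) = a, b
    return (t1 / t2, (i1 - i2) / (1.0 - i2), (f1 - f2) / (1.0 - f2))

a, b = (0.6, 0.3, 0.2), (0.5, 0.4, 0.4)
back_sub = nsvn_sub(nsvn_add(a, b), b)   # should recover a
back_div = nsvn_div(nsvn_mul(a, b), b)   # should recover a
```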

**Category:** General Mathematics

[53] **viXra:1705.0059 [pdf]**
*submitted on 2017-05-04 01:43:17*

**Authors:** Mona Gamal Gafar, Ibrahim El-Henawy

**Comments:** 10 Pages.

Uncertainty and indeterminacy are two major problems in data analysis these days. Neutrosophy is a generalization of fuzzy theory. A neutrosophic system is based on the indeterminacy and falsity of concepts in addition to truth degrees.

**Category:** General Mathematics

[52] **viXra:1705.0058 [pdf]**
*submitted on 2017-05-04 01:44:57*

**Authors:** Rajashi Chatterjee, Pinaki Majumdar, Syamal Kumar Samanta

**Comments:** 9 Pages.

The theory of quadripartitioned single valued neutrosophic sets was proposed very recently as an extension to the existing theory of single valued neutrosophic sets.

**Category:** General Mathematics

[51] **viXra:1705.0057 [pdf]**
*submitted on 2017-05-04 01:46:22*

**Authors:** Tuhin Bera, Nirmal Kumar Mahapatra

**Comments:** 10 Pages.

The notion of neutrosophic soft group is introduced, together with several related properties. Its structural characteristics are investigated with suitable examples.

**Category:** General Mathematics

[50] **viXra:1705.0056 [pdf]**
*submitted on 2017-05-04 01:48:04*

**Authors:** Nasir Shah, Said Broumi

**Comments:** 9 Pages.

The concepts of neighbourly irregular neutrosophic graphs, neighbourly totally irregular neutrosophic graphs, highly irregular neutrosophic graphs and highly totally irregular neutrosophic graphs are introduced. A criteria for neighbourly irregular and highly irregular neutrosophic graphs to be equivalent is discussed.

**Category:** General Mathematics

[49] **viXra:1705.0055 [pdf]**
*submitted on 2017-05-04 01:50:31*

**Authors:** Pablo José Menéndez Vera, Cristhian Fabián Menéndez Delgado, Miriam Peña Gónzalez, Maikel Leyva Vázquez

**Comments:** 9 Pages.

In Ecuador, specifically in Yaguachi Canton, there is enormous potential in rice production, which unfortunately is not being put to good use or driven by marketing strategies. In this work, marketing strategies were developed to help sustain the commercial activity of rice in Yaguachi Canton and its surroundings.

**Category:** General Mathematics

[48] **viXra:1705.0054 [pdf]**
*submitted on 2017-05-04 01:51:47*

**Authors:** Tanushree Mitra Basu, Shyamal Kumar Mondal

**Comments:** 11 Pages.

In this paper, we have introduced a new concept of multi-dimensional neutrosophic soft sets together with various operations, properties and theorems on them.

**Category:** General Mathematics

[47] **viXra:1705.0053 [pdf]**
*submitted on 2017-05-04 01:53:03*

**Authors:** Mridula Sarkar, Samir Dey, Tapan Kumar Roy

**Comments:** 10 Pages.

This paper develops a multi-objective Neutrosophic Goal Optimization (NSGO) technique for optimizing the design of a three-bar truss structure with multiple objectives subject to a specified set of constraints. In this optimum design formulation, the objective functions are the weight and the deflection; the design variables are the cross-sections of the bars; the constraints are the stresses in the members.

**Category:** General Mathematics

[46] **viXra:1705.0052 [pdf]**
*submitted on 2017-05-04 01:54:28*

**Authors:** Wadei Al-Omeri

**Comments:** 9 Pages.

In this paper, the structure of some classes of neutrosophic crisp nearly open sets are investigated via topology, and some applications are given. Finally, we generalize the crisp topological and neutrosophic crisp studies to the notion of neutrosophic crisp set.

**Category:** General Mathematics

[45] **viXra:1705.0051 [pdf]**
*submitted on 2017-05-04 02:05:20*

**Authors:** Rakib Iqbal, Sohail Zafar, Muhammad Shoaib Sardar

**Comments:** 14 Pages.

The objective of this paper is to introduce the concept of neutrosophic cubic sets for subalgebras, ideals and closed ideals of B-algebras. Links among neutrosophic cubic subalgebras, neutrosophic cubic ideals and neutrosophic cubic closed ideals of B-algebras, as well as some related properties, are investigated.

**Category:** General Mathematics

[44] **viXra:1705.0049 [pdf]**
*submitted on 2017-05-04 02:07:35*

**Authors:** T.Chalapathi, R. V M S S Kiran Kumar

**Comments:** 9 Pages.

Most real-world problems in the fields of philosophy, physics, statistics, finance, robotics, design theory, coding theory, knot theory, engineering, and information science contain subtle uncertainty and inconsistency, which cause complexity and difficulty in solving them.

**Category:** General Mathematics

[43] **viXra:1705.0048 [pdf]**
*submitted on 2017-05-04 02:09:32*

**Authors:** Serkan Karatas, Cemil Kuru

**Comments:** 6 Pages.

In this paper, we redefine the neutrosophic set operations and, by using them, we introduce neutrosophic topology and investigate some related properties such as neutrosophic closure, neutrosophic interior, neutrosophic exterior, neutrosophic boundary and neutrosophic subspace.

**Category:** General Mathematics

[42] **viXra:1705.0047 [pdf]**
*submitted on 2017-05-04 02:10:33*

**Authors:** Harish Garg, Nancy

**Comments:** 8 Pages.

Entropy is one of the measures used to quantify the fuzziness of a set. In this article, we present an entropy measure of order α in the single-valued neutrosophic set environment, considering the pair of membership functions as well as the hesitation degree between them.

**Category:** General Mathematics

[41] **viXra:1705.0046 [pdf]**
*submitted on 2017-05-04 02:11:38*

**Authors:** Wenzhong Jiang, Jun Ye

**Comments:** 5 Pages.

This paper defines basic operations of neutrosophic numbers and neutrosophic number functions for objective functions and constraints in optimization models. Then, we propose a general neutrosophic number optimization model for the optimal design of truss structures.

**Category:** General Mathematics

[40] **viXra:1705.0045 [pdf]**
*submitted on 2017-05-04 02:13:04*

**Authors:** Naga Raju I, Rajeswara Reddy P, Diwakar Reddy V, Krishnaiah G.

**Comments:** 9 Pages.

In real-life scientific and engineering problems, decision making is common practice. It may involve a single decision maker or a group of decision makers, whose expressions consist of imprecise, inconsistent and indeterminate information.

**Category:** General Mathematics

[39] **viXra:1705.0044 [pdf]**
*submitted on 2017-05-04 02:14:35*

**Authors:** Pablo José Menéndez Vera, Cristhian Fabián Menéndez Delgado, Susana Paola Carrillo Vera, Milton Villegas Alava, Miriam Peña Gónzales

**Comments:** 4 Pages.

Static analysis is developed in neutrosophic cognitive maps to define the importance of each node based on centrality measures. In this paper, a framework for the static analysis of neutrosophic cognitive maps is presented.

**Category:** General Mathematics

[38] **viXra:1705.0043 [pdf]**
*submitted on 2017-05-04 02:16:06*

**Authors:** Kul Hur, Pyung Ki Lim, Jeong Gon Lee, Junhui Kim

**Comments:** 9 Pages.

We introduce the category NSet(H) consisting of neutrosophic H-sets and morphisms between them. We study NSet(H) in the sense of a topological universe and prove that it is Cartesian closed over Set, where Set denotes the category consisting of ordinary sets and ordinary mappings between them. Furthermore, we investigate some relationships between the two categories ISet(H) and NSet(H).

**Category:** General Mathematics

[37] **viXra:1705.0042 [pdf]**
*submitted on 2017-05-04 02:17:47*

**Authors:** A.A. Salama, Hewayda ElGhawalby, Shimaa Fathi Ali

**Comments:** 4 Pages.

In this paper, we introduce and study a neutrosophic crisp manifold as a new topological structure of manifold via neutrosophic crisp set. Therefore, we study some new topological concepts and some metric distances on a neutrosophic crisp manifold.

**Category:** General Mathematics

[36] **viXra:1705.0041 [pdf]**
*submitted on 2017-05-04 02:19:06*

**Authors:** Sujit Kumar De, Ismat Beg

**Comments:** 14 Pages.

In this study, we introduce the concept of denser property in fuzzy membership function used in neutrosophic sets. We present several new definitions and study their properties.

**Category:** General Mathematics

[35] **viXra:1705.0040 [pdf]**
*submitted on 2017-05-04 02:20:20*

**Authors:** Mridula Sarkar, Samir Dey, Tapan Kumar Roy

**Comments:** 8 Pages.

In this paper, we develop a neutrosophic optimization (NSO) approach for optimizing the design of a plane truss structure with a single objective subject to a specified set of constraints. In this optimum design formulation, the objective functions are the weight of the truss and the deflection of the loaded joint; the design variables are the cross-sections of the truss members; the constraints are the stresses in the members.

**Category:** General Mathematics

[34] **viXra:1705.0039 [pdf]**
*replaced on 2017-05-04 20:08:30*

**Authors:** Radhakrishnamurty Padyala

**Comments:** 5 Pages.

An irreversible adiabatic cyclic process of an ideal gas is an important thermodynamic process. It offers a method of analysing the second law without involving any heat interactions. We show in this note that the impossibility of an irreversible adiabatic cyclic process is equivalent to the assertion that time plays no role in thermodynamic predictions.

**Category:** Thermodynamics and Energy

[33] **viXra:1705.0038 [pdf]**
*submitted on 2017-05-04 03:34:37*

**Authors:** George Rajna

**Comments:** 25 Pages.

Researchers at Sandia National Laboratories have developed new mathematical techniques to advance the study of molecules at the quantum level. [19] Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18] A team of researchers from Australia and the UK have developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17] Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16] Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". 
[11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[32] **viXra:1705.0037 [pdf]**
*submitted on 2017-05-03 11:21:26*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 2 Pages.

In this research investigation, the author presents a Recursive Past Equation and a Recursive Future Equation.

**Category:** Statistics

[31] **viXra:1705.0036 [pdf]**
*submitted on 2017-05-03 08:57:59*

**Authors:** George Rajna

**Comments:** 24 Pages.

A team of researchers at Sandia Labs in the U.S. has developed a type of atom interferometer that does not require super-cooled temperatures. [15] By taking advantage of a phenomenon known as "quantum mechanical squeezing," researchers have conceptually designed a new method of applying atomic force microscopy. [14] In modern physics of the past century, understanding the electronic properties and interactions between electrons inside matter has been a major challenge. [13] An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons-thought to be indivisible building blocks of nature-to break into pieces. [12] In a single particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses wave nature characterized by the de Broglie wave length. In a many particle system, on the other hand, the particles interact each other in a quantum mechanical way and behave as if they are "liquid". This is called quantum liquid whose properties are very different from that of the single particle case. [11] Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. 
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level, it is the dark energy and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[30] **viXra:1705.0033 [pdf]**
*submitted on 2017-05-03 06:34:14*

**Authors:** George Rajna

**Comments:** 23 Pages.

By taking advantage of a phenomenon known as "quantum mechanical squeezing," researchers have conceptually designed a new method of applying atomic force microscopy. [14] In modern physics of the past century, understanding the electronic properties and interactions between electrons inside matter has been a major challenge. [13] An international team of researchers have found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons-thought to be indivisible building blocks of nature-to break into pieces. [12] In a single particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses wave nature characterized by the de Broglie wave length. In a many particle system, on the other hand, the particles interact each other in a quantum mechanical way and behave as if they are "liquid". This is called quantum liquid whose properties are very different from that of the single particle case. [11] Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. 
The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory. The asymmetric sides are creating different frequencies of electromagnetic radiations being in the same intensity level and compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower energy side has no compensating intensity level, it is the dark energy and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[29] **viXra:1705.0032 [pdf]**
*submitted on 2017-05-02 13:21:02*

**Authors:** Omar Abu Arqub, Shaher Momani, Ma'mon Abu Hammad, Ahmed Alsaedi

**Comments:** 13 Pages.

In this article, a residual power series technique for the power series solution of systems of initial value problems is introduced. The new approach provides the solution in the form of a rapidly convergent series with easily computable components using symbolic computation software. The proposed technique obtains the Taylor expansion of the solution of a system and reproduces the exact solution when the solution is polynomial. Numerical examples are included to demonstrate the efficiency, accuracy, and applicability of the presented technique. The results reveal that the technique is very effective, straightforward, and simple.
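As a toy illustration of the idea (my own minimal sketch, not the authors' code), the residual power series technique fixes each series coefficient by forcing successive derivatives of the residual to vanish at the initial point. Here `sympy` is used on the scalar test problem y' = y, y(0) = 1, whose series coefficients are 1/k!:

```python
import sympy as sp

def residual_power_series(f, y0, order):
    """Truncated power-series solution of y' = f(x, y), y(0) = y0.

    Coefficient k is fixed by requiring the (k-1)-th derivative of the
    residual R = y' - f(x, y) to vanish at x = 0, the defining step of
    the residual power series technique.
    """
    x, c = sp.symbols('x c')
    coeffs = [sp.nsimplify(y0)]
    for k in range(1, order + 1):
        # Known part of the series plus the unknown k-th coefficient.
        y = sum(a * x**j for j, a in enumerate(coeffs)) + c * x**k
        residual = sp.diff(y, x) - f(x, y)
        equation = sp.diff(residual, x, k - 1).subs(x, 0)
        coeffs.append(sp.solve(sp.Eq(equation, 0), c)[0])
    return coeffs

# For y' = y the coefficients are those of exp(x): [1, 1, 1/2, 1/6, 1/24, 1/120]
series = residual_power_series(lambda x, y: y, 1, 5)
```

For a polynomial right-hand side the recursion terminates on the exact solution, consistent with the abstract's claim that the method reproduces polynomial solutions exactly.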

**Category:** General Mathematics

[28] **viXra:1705.0031 [pdf]**
*submitted on 2017-05-02 14:21:56*

**Authors:** Terubumi Honjo

**Comments:** 5 Pages.

The orbits of the satellites are drawn toward the Sun, shifted from the calculated values.
This mystery is called the Pioneer Anomaly, and it has remained unsolved for thirty years (see the Wikipedia article).
Every possibility has been investigated, but the question has yet to be settled.
The anomalous force on the satellites is usually calculated as 1/100 billion of gravity.
Previously, when the mystery of the precession of Mercury's orbit was still unsolved, the newly born general theory of relativity explained it for the first time as a distortion of space by gravity.
That solution became a powerful verification experiment for general relativity and proved the correctness of the theory.
Similarly, it is forecast that the Pioneer Anomaly is a mystery that will be solved for the first time by an as yet unknown theory.
If the answer to the Pioneer Anomaly given by the principle of particle pulsation is true,
it becomes a powerful verification experiment for the gravitational action that the particle pulsation principle model predicts,
and it will demonstrate the correctness of that theory.

**Category:** Astrophysics

[27] **viXra:1705.0030 [pdf]**
*submitted on 2017-05-02 14:34:44*

**Authors:** Yanming Wei

**Comments:** 5 Pages. DOI: 10.13140/RG.2.2.28522.72648

Why must we remain wedded to rotary turbines for tidal or wind energy harvesting? Perhaps we have simply not been clever enough to find a new way. Here I propose a turbine of rectangular cross section that works in reciprocating mode to harvest energy from any flowing fluid. In a sense, fluid flows like electric DC (direct current), while the reciprocal motion of the device's ram behaves like AC (alternating current); thus a DC-AC mechanical inverter is needed. Of course, inverse utilization of the same mechanism yields an AC-DC mechanical rectifier, i.e. an exotic pump.

**Category:** Thermodynamics and Energy

[26] **viXra:1705.0029 [pdf]**
*submitted on 2017-05-02 14:43:50*

**Authors:** Gerges Francis Tawdrous

**Comments:** 51 Pages.

The Solar System Geometry Series considers the Solar Group as one machine whose parts cooperate together.
In this Part (No. 3) we discuss planet diameter creation and its job:
how solar planetary geometry defines a specific diameter for a specific planet,
what would happen if a planet's diameter changed,
and how the Sun's diameter is created.
Through the planet diameter study we discuss the origin and definition of matter.

**Category:** Astrophysics

[25] **viXra:1705.0028 [pdf]**
*submitted on 2017-05-02 15:33:07*

**Authors:** Morad Ahmad, Shaher Momani, Omar Abu Arqub, Mohammed Al-Smadi, Ahmed Alsaedi

**Comments:** 13 Pages.

In this paper, a powerful computational algorithm is developed for the solution of classes of singular second-order, three-point Volterra integrodifferential equations in favorable reproducing kernel Hilbert spaces. The solution is represented as a series in the Hilbert space W₂³[0,1] with easily computable components. To find the computational solutions, we generate an orthogonal basis from the obtained kernel functions and construct the orthonormal basis used to formulate and compute the solutions. Numerical experiments are carried out in which two smooth reproducing kernel functions are used throughout the evolution of the algorithm to obtain the required nodal values of the unknown variables. The error estimates are proven to converge to zero in the sense of the space norm. Several computational simulation experiments are given to show the good performance of the proposed procedure. Finally, the results show that the present algorithm provides a good solution methodology for multipoint singular boundary value problems restricted by the Volterra operator.
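The basis-construction step named in the abstract, generating an orthonormal basis from the kernel functions, amounts, once the functions are evaluated at the collocation nodes, to Gram-Schmidt orthonormalization. A minimal sketch under that assumption, with plain vectors standing in for sampled kernel functions (not the authors' algorithm):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a set of vectors,
    skipping any vector that is (numerically) linearly dependent."""
    basis = []
    for v in np.asarray(vectors, dtype=float):
        # Subtract the projections onto the basis built so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-12:
            basis.append(w / norm)
    return np.array(basis)

# Stand-ins for kernel functions sampled at three nodes.
Q = gram_schmidt([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
# Rows of Q are orthonormal, so Q @ Q.T is the identity matrix.
```

In practice one would use a numerically stabler variant (modified Gram-Schmidt or a QR factorization), but the idea is the same.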

**Category:** Functions and Analysis

[24] **viXra:1705.0027 [pdf]**
*submitted on 2017-05-02 21:38:43*

**Authors:** Murat Arslan

**Comments:** 116 Pages.

In this thesis, obstacle detection from images of objects and the subsequent pathfinding
problem of the NAO humanoid robot are considered. NAO's camera is used to capture images
of the world map. Each captured image is processed and classified into two classes: area
with obstacles and area without obstacles. For classification of the images, a Support
Vector Machine (SVM) is used. After classification, the map of the world is obtained as
areas with and without obstacles; this map is the input for the pathfinding algorithm. In
the thesis, the A* pathfinding algorithm is used to find a path from the start point to
the goal.
The aim of this work is to implement a support-vector-machine-based solution to the robot
guidance problem, visual path planning, and obstacle avoidance. The algorithms used allow
the robot to detect obstacles and find an optimal path. The thesis describes the basic
steps of navigation of mobile robots.
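The planning half of such a pipeline can be sketched in a few lines. The following is a generic grid A* with a Manhattan heuristic (a hypothetical stand-alone example, not the thesis code), taking as input the kind of obstacle/free occupancy map an SVM classifier would produce:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (1 = obstacle), 4-connected moves,
    Manhattan-distance heuristic. Returns the path as a list of cells,
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_heap = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, g, cell, parent = heapq.heappop(open_heap)
        if cell in came_from:        # already expanded with a better g
            continue
        came_from[cell] = parent
        if cell == goal:             # walk parents back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float('inf')):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap,
                                   (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None

# 0 = free, 1 = obstacle (the classifier's output map).
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

With an 8-connected neighborhood and a Euclidean or octile heuristic the same skeleton applies unchanged.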

**Category:** Artificial Intelligence

[23] **viXra:1705.0026 [pdf]**
*submitted on 2017-05-03 03:48:26*

**Authors:** Leo Vuyk

**Comments:** 37 Pages.

In Quantum Function Follows Form Theory, ( Q-FFF Theory) the Big Bang was the evaporation and splitting of a former Big Crunch black hole nucleus of compressed massless Axion Higgs particles.
The Big Bang nucleus of compressed massless Axion/Higgs particles is assumed to be split into chunky nuclei of dark matter black holes, called plasma/matter-creating Quasars, or evaporated as singular massless Axion/Higgs vacuum particles oscillating along a tetrahedral-shaped chiral vacuum lattice.
The vacuum Lattice is supposed to represent a dynamic reference frame with variable local length and so called Dark Energy or Zero Point Energy acting as the motor for all Fermion spin and eigen energy and as the transfer medium for all photon information, leading to local lightspeed and local time.
The energetic oscillating vacuum lattice is assumed to act as a Gravity Quantum Dipole Repeller (or large-scale Casimir effect).
Gravitons are supposed not to attract but to repel Fermions, with less impulse than the vacuum particle pressure exerts. So, gravity is assumed to be a dual pressure process on fermions.
As a consequence, Feynman diagrams become more complex than before.
Recent measurements by Yehuda Hoffman et al. did show the repelling effect of “empty space” in opposition of the “attracting gravity effect” of super clusters which he called “Dipole Repeller” effect.
The universe is supposed to be Holographic by the instant long distance entanglement particle guidance between Charge Parity Symmetric Universes; the Multiverse.

**Category:** Astrophysics

[22] **viXra:1705.0025 [pdf]**
*submitted on 2017-05-02 08:06:33*

**Authors:** George Rajna

**Comments:** 19 Pages.

A team of researchers working on the CERN Axion Solar Telescope (CAST) project report passing an important milestone in their search for the axion: they have moved below established astrophysical constraints and are now working in an area that is expected to reap many rewards regarding both the axion and other avenues of physics research. [19] If the axion exists and is the main component of Dark Matter, the very relic axions that would be bombarding us continuously could be detected using microwave cavities resonant with the axion mass, immersed in powerful magnetic fields. [18] In yet another attempt to nail down the elusive nature of dark matter, a European team of researchers has used a supercomputer to develop a profile of the yet-to-be-detected entity that appears to pervade the universe. [17] MIT physicists are proposing a new experiment to detect a dark matter particle called the axion. If successful, the effort could crack one of the most perplexing unsolved mysteries in particle physics, as well as finally yield a glimpse of dark matter. [16] Researchers at Stockholm University are getting closer to light dark-matter particle models. Observations rule out some axion-like particles in the quest for the content of dark matter. The article is now published in Physical Review Letters. [15] Scientists have detected a mysterious X-ray signal that could be caused by dark matter streaming out of our Sun's core. Hidden photons are predicted in some extensions of the Standard Model of particle physics, and unlike WIMPs they would interact electromagnetically with normal matter. In particle physics and astrophysics, weakly interacting massive particles, or WIMPs, are among the leading hypothetical particle physics candidates for dark matter. The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and dark energy.
The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood from the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[21] **viXra:1705.0024 [pdf]**
*submitted on 2017-05-02 08:30:38*

**Authors:** George Rajna

**Comments:** 24 Pages.

By replacing the phosphor screen in a laser phosphor display (LPD) with a luminescent solar concentrator (LSC), one can harvest energy from ambient light as well as display high-resolution images. [34] A team of researchers from Japan reports this week in Applied Physics Letters, that they have discovered a phenomenon called the photodielectric effect, which could lead to laser-controlled touch displays. [33] Researchers from the ARC Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS) in the University of Sydney's Australian Institute for Nanoscale Science and Technology have made a breakthrough achieving radio frequency signal control at sub-nanosecond time scales on a chip-scale optical device. [32] The shrinking of electronic components and the excessive heat generated by their increasing power has heightened the need for chip-cooling solutions, according to a Rutgers-led study published recently in Proceedings of the National Academy of Sciences. Using graphene combined with a boron nitride crystal substrate, the researchers demonstrated a more powerful and efficient cooling mechanism. [31] Materials like graphene can exhibit a particular type of large-amplitude, stable vibrational modes that are localised, referred to as Discrete Breathers (DBs). [30] A two-dimensional material developed by Bayreuth physicist Prof. Dr. Axel Enders together with international partners could revolutionize electronics. [29] Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor-meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive-which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. 
[27] Superconductivity is a rare physical state in which matter is able to conduct electricity—maintain a flow of electrons—without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov hones in on the structural changes underlying superconductivity in iron arsenide compounds—those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Quantum Physics

[20] **viXra:1705.0023 [pdf]**
*submitted on 2017-05-02 08:45:41*

**Authors:** P. R. Silva

**Comments:** 6 Pages. 12 references.

The Holographic Principle is applied in 3-D and 2-D universes described by volume and surface of a sphere having a radius in the scale of the cosmological constant. Making the equality between the degrees of freedom furnished by the holographic description of them, we find the radius of the observable universe. While doing this, weak interaction coupling and quantum gravity play their role.

**Category:** Quantum Gravity and String Theory

[19] **viXra:1705.0022 [pdf]**
*submitted on 2017-05-02 09:06:08*

**Authors:** George Rajna

**Comments:** 41 Pages.

Neuralink – which is "developing ultra high bandwidth brain-machine interfaces to connect humans and computers" – is probably a bad idea. If you understand the science behind it, and that's what you wanted to hear, you can stop reading. [26]
But now there is a technology that enables us to "read the mind" with growing accuracy: functional magnetic resonance imaging (fMRI). [25]
Advances in microscopy techniques have often triggered important discoveries in the field of neuroscience, enabling vital insights in understanding the brain and promising new treatments for neurodegenerative diseases such as Alzheimer's and Parkinson's. [24]
What is the relationship of consciousness to the neurological activity of the brain? Does the brain behave differently when a person is fully conscious, when they are asleep, or when they are undergoing an epileptic seizure? [23]
Consciousness appears to arise naturally as a result of a brain maximizing its information content. So says a group of scientists in Canada and France, which has studied how the electrical activity in people's brains varies according to individuals' conscious states. The researchers find that normal waking states are associated with maximum values of what they call a brain's "entropy". [22]
New research published in the New Journal of Physics tries to decompose the structural layers of the cortical network to different hierarchies enabling to identify the network's nucleus, from which our consciousness could emerge. [21]
Where in your brain do you exist? Is your awareness of the world around you and of yourself as an individual the result of specific, focused changes in your brain, or does that awareness come from a broad network of neural activity? How does your brain produce awareness? [20]
In the future, level-tuned neurons may help enable neuromorphic computing systems to perform tasks that traditional computers cannot, such as learning from their environment, pattern recognition, and knowledge extraction from big data sources. [19]
IBM scientists have created randomly spiking neurons using phase-change materials to store and process data. This demonstration marks a significant step forward in the development of energy-efficient, ultra-dense integrated neuromorphic technologies for applications in cognitive computing. [18]
An ion trap with four segmented blade electrodes used to trap a linear chain of atomic ions for quantum information processing. Each ion is addressed optically for individual control and readout using the high optical access of the trap. [17]
To date, researchers have realised qubits in the form of individual electrons (aktuell.ruhr-uni-bochum.de/pm2012/pm00090.html.en). However, this led to interferences and rendered the information carriers difficult to programme and read. The group has solved this problem by utilising electron holes as qubits, rather than electrons. [16]
Physicists from MIPT and the Russian Quantum Center have developed an easier method to create a universal quantum computer using multilevel quantum systems (qudits), each one of which is able to work with multiple "conventional" quantum elements – qubits. [15]
Precise atom implants in silicon provide a first step toward practical quantum computers. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Mind Science

[18] **viXra:1705.0021 [pdf]**
*submitted on 2017-05-02 10:08:32*

**Authors:** Solomon Budnik

**Comments:** 1 Page. NextGen aeronautics and aerodynamics

Morphing triple winglets enhance flight, reduce fuel consumption, increase flight range, negate turbulence effects, and prevent stalls and crashes.

**Category:** General Science and Philosophy

[17] **viXra:1705.0019 [pdf]**
*submitted on 2017-05-02 04:07:01*

**Authors:** Robert B. Easter, Eckhard Hitzer

**Comments:** 25 Pages. Published online First in AACA, 20th April 2017. DOI: 10.1007/s00006-017-0784-0. 2 tables, 26 references.

This paper introduces the Double Conformal / Darboux Cyclide Geometric Algebra (DCGA), based in the $\mathcal{G}_{8, 2}$ Clifford geometric algebra. DCGA is an extension of CGA and has entities representing points and general (quartic) Darboux cyclide surfaces in Euclidean 3D space, including circular tori and all quadrics, and all surfaces formed by their inversions in spheres. Dupin cyclides are quartic surfaces formed by inversions in spheres of torus, cylinder, and cone surfaces. Parabolic cyclides are cubic surfaces formed by inversions in spheres that are centered on points of other surfaces. All DCGA entities can be transformed by versors, and reflected in spheres and planes.
Keywords: Conformal geometric algebra, Darboux Dupin cyclide, Quadric surface
Math. Subj. Class.: 15A66, 53A30, 14J26, 53A05, 51N20, 51K05

**Category:** Algebra

[16] **viXra:1705.0018 [pdf]**
*submitted on 2017-05-02 05:06:17*

**Authors:** Mohammed Mezouar

**Comments:** 2 Pages.

The Lorentz transformation allows two ways to compare time measures from two moving clocks. We show that the more realistic way leads to the discovery that absolute rest plays a hidden role and prescribes a restriction on the relativity principle.
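For concreteness, the quantity at the heart of any such clock comparison is the Lorentz factor; a minimal numeric sketch of standard special relativity (a generic illustration, not the paper's argument):

```python
import math

def gamma(v, c=1.0):
    """Lorentz factor for relative speed v, in units where c = 1."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# At v = 0.6c, gamma = 1.25: each observer judges the other clock to
# accumulate 10 s of proper time while 12.5 s elapse in his own frame;
# this symmetry is what makes comparing the two time measures subtle.
t_proper = 10.0
t_observed = gamma(0.6) * t_proper
```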

**Category:** Relativity and Cosmology

[15] **viXra:1705.0017 [pdf]**
*submitted on 2017-05-01 15:21:32*

**Authors:** Sylwester Kornowski

**Comments:** 2 Pages.

It is assumed that the asymmetric decays of neutral kaons and B mesons make an absolute distinction between matter and antimatter. Such asymmetric decays have been observed in collisions of nucleons only; there are no experiments in which kaons and B mesons are produced in collisions of antinucleons only. Here, applying the Scale-Symmetric Theory (SST), we show that the internal helicity of created neutral kaons (according to SST, a relativistic neutral kaon is also a constituent of the neutral B meson) depends on the internal helicity of the colliding particles: nucleons are internally left-handed whereas antinucleons are right-handed. SST shows that there should be no distinction between decays of neutral kaons and B mesons created in collisions of matter only and in collisions of antimatter only. In reality, the matter-antimatter asymmetry does not follow from different behaviour of matter and antimatter in weak interactions but from the external left-handedness of the initial inflation field. This caused more nucleons than antinucleons to appear at the end of inflation. Next, the return shock wave, carrying the additional nucleons, created the early Universe.

**Category:** High Energy Particle Physics

[14] **viXra:1705.0016 [pdf]**
*submitted on 2017-05-01 15:57:00*

**Authors:** Solomon Budnik

**Comments:** 1 Page. NextGen helicopter

Amphibious VTOL hover (inflatable) tilt rotorcraft
This is a multimodal land-sea-air rotorcraft, combining the properties of a hovercraft and a helicopter.

**Category:** General Science and Philosophy

[13] **viXra:1705.0015 [pdf]**
*submitted on 2017-05-01 16:03:47*

**Authors:** Solomon Budnik

**Comments:** 1 Page. NextGen fighter jet

Fighter ramjet
This is a 6th generation morphing fighter jet.

**Category:** General Science and Philosophy

[12] **viXra:1705.0014 [pdf]**
*submitted on 2017-05-01 16:11:27*

**Authors:** Solomon Budnik

**Comments:** 1 Page. Nextgen aircraft wing and winglets design

Forward swept wing with morphing triple winglets.
This is a superwing for NextGen superjet.

**Category:** General Science and Philosophy

[11] **viXra:1705.0012 [pdf]**
*submitted on 2017-05-01 22:52:24*

**Authors:** Ramesh Chandra Bagadi

**Comments:** 3 Pages.

In this research manuscript, the author presents a Recursive Equation Connecting Future and Past.

**Category:** Mathematical Physics

[10] **viXra:1705.0011 [pdf]**
*submitted on 2017-05-01 23:45:14*

**Authors:** Richard L. Amoroso

**Comments:** 14 Pages. Version to be published in 10th proceedings honoring mathematical physicist Jean-Pierre Vigier by World Scientific

Ontological-Phase Topological Field Theory (OPTFT), under seminal development to formally describe 3rd-regime Unified Field Mechanics (UFM) (classical-quantum-UFM), is extended to relate the duality of Newton-Einstein gravitation theory through added degrees of freedom in a semi-quantum limit, enabling insight into topological Dirac-Majorana doublet fusion supervening the uncertainty principle.

**Category:** Mathematical Physics

[9] **viXra:1705.0010 [pdf]**
*submitted on 2017-05-02 02:23:15*

**Authors:** George Rajna

**Comments:** 28 Pages.

EPFL scientists have now carried out a study on a lithium-containing copper oxide and have found that its electrons are 2.5 times lighter than was predicted by theoretical calculations. [17]
Washington State University physicists have created a fluid with negative mass, which is exactly what it sounds like. Push it, and unlike every physical object in the world we know, it doesn't accelerate in the direction it was pushed. It accelerates backwards. [16]
When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15]
Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]
Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that this type of crystal is a rare new material that can house a quantum spin liquid. [13]
An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons - thought to be indivisible building blocks of nature - to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum-mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.
The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[8] **viXra:1705.0009 [pdf]**
*submitted on 2017-05-01 11:29:39*

**Authors:** George Rajna

**Comments:** 25 Pages.

Physicists have learned how they could breed Schrödinger cats in optics. Scientists tested a method that could potentially amplify superpositions of classical states of light beyond microscopic limits and help determine the boundaries between the quantum and classical worlds. [19]
Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18]
A team of researchers from Australia and the UK has developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17]
Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]
Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before, but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[7] **viXra:1705.0008 [pdf]**
*submitted on 2017-05-01 05:20:29*

**Authors:** George Rajna

**Comments:** 23 Pages.

Correlation functions are often employed to quantify the relationships among interdependent variables or sets of data. A few years ago, two researchers proposed a property-testing problem involving Forrelation for studying the query complexity of quantum devices. [18]
A team of researchers from Australia and the UK has developed a new theoretical framework to identify computations that occupy the 'quantum frontier'—the boundary at which problems become impossible for today's computers and can only be solved by a quantum computer. [17]
Scientists at the University of Sussex have invented a groundbreaking new method that puts the construction of large-scale quantum computers within reach of current technology. [16]
Physicists at the University of Bath have developed a technique to more reliably produce single photons that can be imprinted with quantum information. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before, but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[6] **viXra:1705.0007 [pdf]**
*submitted on 2017-05-01 06:00:52*

**Authors:** George Rajna

**Comments:** 34 Pages.

A team of scientists at Bilkent has designed the simplest experimental system to date to identify the minimum requirements for the emergence of complexity. Their work is reported in the current issue of Nature Communications. [20]
Like two magnets being pulled toward each other, tiny crystals twist, align and slam into each other, but due to an altogether different force. For the first time, researchers have measured the force that draws them together and visualized how they swivel and align. [19]
Researchers at Georgia Institute of Technology have found a material used for decades to color food items ranging from corn chips to ice creams could potentially have uses far beyond food dyes. [18]
Liquid droplets are natural magnifiers. Look inside a single drop of water, and you are likely to see a reflection of the world around you, close up and distended as you'd see in a crystal ball. [17]
MIT physicists have created a new form of matter, a supersolid, which combines the properties of solids with those of superfluids. [16]
When matter is cooled to near absolute zero, intriguing phenomena emerge. These include supersolidity, where crystalline structure and frictionless flow occur together. ETH researchers have succeeded in realising this strange state experimentally for the first time. [15]
Helium atoms are loners. Only if they are cooled down to an extremely low temperature do they form a very weakly bound molecule. In so doing, they can keep a tremendous distance from each other thanks to the quantum-mechanical tunnel effect. [14]
Inside a new exotic crystal, physicist Martin Mourigal has observed strong indications of "spooky" action, and lots of it. The results of his experiments, if corroborated over time, would mean that the type of crystal is a rare new material that can house a quantum spin liquid. [13]
An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons - thought to be indivisible building blocks of nature - to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum-mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.
The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[5] **viXra:1705.0006 [pdf]**
*submitted on 2017-05-01 07:33:55*

**Authors:** George Rajna

**Comments:** 33 Pages.

Adding to strong recent demonstrations that particles of light perform what Einstein called "spooky action at a distance," in which two separated objects can have a connection that exceeds everyday experience, physicists at the National Institute of Standards and Technology (NIST) have confirmed that particles of matter can act really spooky too. [17]
How fast will a quantum computer be able to calculate? While fully functional versions of these long-sought technological marvels have yet to be built, one theorist at the National Institute of Standards and Technology (NIST) has shown that, if they can be realized, there may be fewer limits to their speed than previously put forth. [16]
Unlike experimental neuroscientists who deal with real-life neurons, computational neuroscientists use model simulations to investigate how the brain functions. [15]
A pair of physicists with ETH Zurich has developed a way to use an artificial neural network to characterize the wave function of a quantum many-body system. [14]
A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13]
Nobody understands why deep neural networks are so good at solving complex problems. Now physicists say the secret is buried in the laws of physics. [12]

**Category:** Digital Signal Processing

[4] **viXra:1705.0005 [pdf]**
*replaced on 2017-09-12 02:45:15*

**Authors:** Peter V. Raktoe

**Comments:** 3 Pages.

There are many unsolved mysteries in modern theoretical physics, and physicists don't realize that most of these mysteries are man-made. Most theories are intertwined with Einstein's theory of gravity, yet that foundation of modern theoretical physics is not real. Einstein didn't know what gravity was, so he could only give us a mathematical reason for it. But Einstein made several mistakes in his mathematical model of gravity: it may seem mathematically correct, but it is realistically impossible. Einstein's theory of gravity was unrealistic, and it was devastating to modern theoretical physics because it resulted in a maze of mathematical fiction.

**Category:** Quantum Physics

[3] **viXra:1705.0004 [pdf]**
*submitted on 2017-05-01 08:50:02*

**Authors:** George Rajna

**Comments:** 22 Pages.

In modern physics of the past century, understanding the electronic properties of matter and the interactions between electrons inside it has been a major challenge. [13]
An international team of researchers has found evidence of a mysterious new state of matter, first predicted 40 years ago, in a real material. This state, known as a quantum spin liquid, causes electrons - thought to be indivisible building blocks of nature - to break into pieces. [12]
In a single-particle system, the behavior of the particle is well understood by solving the Schrödinger equation. Here the particle possesses a wave nature characterized by the de Broglie wavelength. In a many-particle system, on the other hand, the particles interact with each other in a quantum-mechanical way and behave as if they were "liquid". This is called a quantum liquid, whose properties are very different from those of the single-particle case. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.
The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Condensed Matter

[2] **viXra:1705.0003 [pdf]**
*submitted on 2017-05-01 09:18:51*

**Authors:** George Rajna

**Comments:** 24 Pages.

Physicists have theoretically shown that, when multiple nanoscale batteries are coupled together, they can be charged faster than if each battery were charged individually. [15]
Researchers have shown how to create a rechargeable "spin battery" made out of materials called topological insulators, a step toward building new spintronic devices and quantum computers. [14]
Fermions are ubiquitous elementary particles. They range from electrons in metals to protons and neutrons in nuclei, and to quarks at the sub-nuclear level. Further, they possess an intrinsic degree of freedom called spin, with only two possible configurations, either up or down. In a new study published in EPJ B, theoretical physicists explore the possibility of separately controlling the up and down spin populations of a group of interacting fermions. [13]
An international consortium led by researchers at the University of Basel has developed a method to precisely alter the quantum mechanical states of electrons within an array of quantum boxes. The method can be used to investigate the interactions between various types of atoms and electrons, which is essential for future quantum technologies, as the group reports in the journal Small. [12]
Quantum systems are extremely hard to analyze if they consist of more than just a few parts. It is not difficult to calculate a single hydrogen atom, but in order to describe an atom cloud of several thousand atoms, it is usually necessary to use rough approximations. The reason for this is that quantum particles are connected to each other and cannot be described separately. [11]
Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are "operationally equivalent"—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less-well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies. [10]
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.
The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron – proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Quantum Physics

[1] **viXra:1705.0002 [pdf]**
*submitted on 2017-05-01 01:20:45*

**Authors:** Ahmida Bendjoudi

**Comments:** 10 Pages.

Here, the reader will find the full code used in Quranic relativity, so that the results can be reproduced independently.

**Category:** Religion and Spiritualism