Chapter 3: Graphics and visualization

Exercises are from "Computational Physics" by Mark Newman.
Programmed by Sasha Bakker.

In [45]:
from numpy import *
import matplotlib.pyplot as plt
import math

Exercise 3.1: Plotting experimental data

Download sunspots.txt, which contains the observed number of sunspots on the Sun for each month since January 1749. The file contains two columns of numbers, the first being the month and the second being the sunspot number. The running average can be computed as follows:

$Y_{k}=\frac{1}{2r+1}\sum_{m=-r}^{r} y_{k+m}$,
where $r=5$ in this case and the $y_{k}$ are the sunspot numbers.

In [56]:
data = loadtxt("sunspots.txt", float)

def runningAverage(y,r):
    ravg = list()
    c = 1/(2*r+1)
    for k in range(r,len(y)-r):  # every k for which the full window y[k-r..k+r] exists
        s = 0
        for m in range(-r,r+1):
            s += y[k+m]
        ravg.append(c*s)
    return ravg

r = 5
N = 1000
ravg = runningAverage(data[0:N+1,1],r)

plt.figure(figsize=(10,3))
plt.plot(data[:N+1,0],data[:N+1,1],'y.',label='Data')
plt.plot(data[r:N-r+1,0],ravg,color='black',label='Running Average')
plt.title("Sunspots vs. Time")
plt.legend()
plt.show()
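The double loop can also be written as a single vectorized convolution with a uniform window, implementing the same formula for $Y_k$ (a sketch using numpy.convolve; the function name is my own):

```python
import numpy as np

def running_average(y, r):
    # Uniform window of width 2r+1; "valid" mode keeps only the positions
    # where the window fits entirely inside y, i.e. k = r .. len(y)-r-1
    window = np.ones(2 * r + 1) / (2 * r + 1)
    return np.convolve(y, window, mode="valid")

y = np.arange(20, dtype=float)
print(running_average(y, 5))
```

For a length-1001 column of sunspot numbers and r = 5 this returns 991 values, one for each month where the full 11-month window fits.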

Exercise 3.2: Curve plotting

A deltoid curve is defined by

$x=2\cos \theta + \cos 2\theta$ and $y = 2 \sin \theta - \sin 2\theta$
where $0 \leq \theta < 2\pi$.

In [57]:
N = 1000
theta = linspace(0,2*pi,N)
x = 2*cos(theta)+cos(2*theta)
y = 2*sin(theta)-sin(2*theta)
plt.figure(figsize=(6,3))
plt.plot(x,y,color='purple')
plt.title("Deltoid Curve")
plt.show()

A Galilean spiral is defined by $r = \theta ^{2}$ for $ 0 \leq \theta \leq 10\pi$.

In [58]:
N = 2000
theta = linspace(0,10*pi,N)
r = theta**2
x = r*cos(theta)
y = r*sin(theta)
plt.figure(figsize=(6,3))
plt.plot(x,y,color='purple')
plt.title("Galilean Spiral")
plt.show()
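Since $r$ and $\theta$ are already polar coordinates, matplotlib can also draw the curve on polar axes directly, skipping the manual conversion to $x$ and $y$. A minimal standalone sketch (the output filename is arbitrary):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # file-based backend; unnecessary inside a notebook
import matplotlib.pyplot as plt

theta = np.linspace(0, 10 * np.pi, 2000)
r = theta**2  # Galilean spiral in polar form

fig = plt.figure(figsize=(4, 4))
ax = fig.add_subplot(projection="polar")  # axes that plot (theta, r) pairs
ax.plot(theta, r, color="purple")
ax.set_title("Galilean Spiral")
fig.savefig("galilean_polar.png")
```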

"Fey's function" is defined by:

$r = e^{\cos \theta}-2 \cos 4\theta + \sin ^{5} \frac{\theta}{12} $
in the range $0 \leq \theta \leq 24\pi$.

In [59]:
N = 4000
theta = linspace(0,24*pi,N)
r = exp(cos(theta))-2*cos(4*theta)+(sin(theta/12))**5
x = r*cos(theta)
y = r*sin(theta)
plt.figure(figsize=(6,3))
plt.plot(x,y,color='purple')
plt.title("Fey's Function")
plt.show()

Exercise 3.3: Scanning Tunneling Microscope

Download stm.txt, which contains a grid of values from scanning tunneling microscope measurements of the surface of silicon. A scanning tunneling microscope (STM) is a device that measures the shape of a surface at the atomic level by tracking a sharp tip over the surface and measuring quantum tunneling current as a function of position. The end result is a grid of values that represent the height of the surface, and the file stm.txt contains just such a grid of values.

In [52]:
data = loadtxt("stm.txt", float)
plt.figure(figsize=(7,7))
plt.title("Silicon Surface Height")
plt.imshow(data,origin="lower")
plt.set_cmap("plasma")
plt.colorbar()
plt.show()

Exercise 3.6: Deterministic chaos and the Feigenbaum plot

One of the most famous examples of the phenomenon of chaos is the logistic map, defined by the equation

$x'=rx(1-x)$

For a given value of the constant r you take a value of x -- say $x=\frac{1}{2}$ -- and you feed it into the right-hand side of this equation, which gives you a value of x'. Then you take that value and feed it back in on the right-hand side again, which gives you another value, and so forth. This is an iterative map. You keep doing the same operation over and over on your value of x, and one of three things happens:

  1. The value settles down to a fixed number and stays there. This is called a fixed point. For instance, $x=0$ is always a fixed point of the logistic map.
  2. It doesn't settle down to a single value, but it settles into a periodic pattern, cycling through a set of values, say four values, repeating them in sequence over and over. This is called a limit cycle.
  3. It goes crazy. It generates a seemingly random sequence of numbers that appear to have no rhyme or reason to them at all. This is deterministic chaos. "Chaos" because it really does look chaotic, and "deterministic" because even though the values look random, they're not. They're completely predictable, because they are given to you by one simple equation. The behavior is determined, although it may not look like it.
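All three behaviours can be checked numerically by iterating the map; a small sketch (the sample values of $r$ below are illustrative choices, not taken from the text):

```python
def iterate_logistic(r, x=0.5, n=1000):
    # Apply the logistic map x' = r*x*(1-x) n times and return the final x
    for _ in range(n):
        x = r * x * (1 - x)
    return x

# 1. Fixed point: for r = 2.5 the iterates settle near x* = 1 - 1/r = 0.6
print(iterate_logistic(2.5))
# 2. Limit cycle: for r = 3.2 the iterates alternate between two values
a = iterate_logistic(3.2)
print(a, 3.2 * a * (1 - a))
# 3. Chaos: for r = 3.9 the sequence wanders without settling
print(iterate_logistic(3.9, n=1000), iterate_logistic(3.9, n=1001))
```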
In [16]:
x1 = list()
x2 = list()
rvals = arange(1,4,0.01)

for r in rvals:
    N = 999
    x = 1/2
    while N > 0:
        x = r*x*(1-x)
        N -=1
    x1.append(x)
    N = 999
    while N > 0:
        x = r*x*(1-x)
        N -=1
    x2.append(x)

plt.figure(figsize=(8,4))
plt.plot(rvals,x1,'y.')
plt.plot(rvals,x2,'k.')
plt.show()

For $1.0<r<3.0$ the map settles to a fixed point, and for $3.0<r\lesssim 3.57$ it settles into a limit cycle whose period doubles repeatedly as $r$ grows. Beyond $r \approx 3.57$ the result is deterministic chaos (interrupted by narrow periodic windows), so this accumulation point is often called the "edge of chaos".
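These regimes can also be identified programmatically by counting how many distinct values the long-run orbit visits (a rough sketch; the transient length, rounding precision, and sample values of $r$ are my own choices):

```python
def attractor_size(r, x=0.5, transient=10000, sample=64, digits=6):
    # Discard a long transient so x has settled onto the attractor,
    # then count distinct iterates after rounding away floating-point noise
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        seen.add(round(x, digits))
    return len(seen)

print(attractor_size(2.5))   # fixed point: 1 distinct value
print(attractor_size(3.2))   # limit cycle: 2 distinct values
print(attractor_size(3.9))   # chaos: a new value nearly every step
```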

Exercise 3.7: The Mandelbrot set

The Mandelbrot set, named after its discoverer, the French mathematician Benoit Mandelbrot, is a fractal, an infinitely ramified mathematical object that contains structure within structure within structure, as deep as we care to look. Consider the equation

$z'=z^{2}+c$
where $z$ is a complex number and $c$ is a complex constant. For any given value of $c$ this equation turns an input number $z$ into an output number $z'$. The definition of the Mandelbrot set involves the repeated iteration of this equation: we take an initial value and feed it in again to get another value, and so forth. The Mandelbrot set is the set of points in the complex plane that satisfies the following definition:

For a given complex value of c, start with z = 0 and iterate repeatedly. If the magnitude |z| of the resulting value is ever greater than 2, then the point in the complex plane at position c is not in the Mandelbrot set, otherwise it is in the set.
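This definition translates directly into a small membership test, with the iteration capped at some finite number (100 here, matching the grid code below; in principle membership requires infinitely many iterations):

```python
def in_mandelbrot(c, max_iter=100):
    # Iterate z -> z**2 + c from z = 0; if |z| ever exceeds 2 the orbit
    # escapes to infinity and c is outside the set
    z = 0
    for _ in range(max_iter):
        if abs(z) > 2:
            return False
        z = z * z + c
    return True

print(in_mandelbrot(0))    # True: the orbit stays at 0
print(in_mandelbrot(-1))   # True: the orbit cycles -1 -> 0 -> -1
print(in_mandelbrot(1))    # False: 0 -> 1 -> 2 -> 5 -> ... escapes
```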

In [37]:
N = 1000
x = linspace(-2,2,N)
y = x
x,y = meshgrid(x,y)
Z = zeros([N,N])
c = x+1j*y

for i in range(N):
    for j in range(N):
        z = 0
        nn = 100
        while nn > 0:
            if abs(z) > 2.0: break
            z = z**2 + c[i,j]
            nn -= 1
        # store the (real) iteration count rather than the complex z itself,
        # which avoids a ComplexWarning from casting z into the real array Z
        Z[i,j] = 100 - nn

plt.figure(figsize=(9,9))
plt.title("Mandelbrot Set")
plt.imshow(Z, origin="lower", extent=[-2,2,-2,2])
plt.set_cmap("twilight")
plt.colorbar()
plt.show()

Exercise 3.8: Least-squares fitting and the photoelectric effect

We can find the straight line that gives the best compromise fit to the data. The standard technique for doing this is the method of least squares. Suppose we make some guess about the parameters m and c for the straight line. We then calculate the vertical distances between the data points and that line, and then the sum of the squares of those distances, which we denote $\chi^{2}$. If we have N data points with coordinates $(x_{i},y_{i})$, then $\chi^{2}$ is given by

$\chi^{2}=\sum_{i=1}^{N}(mx_{i}+c-y_{i})^2$.

The least-squares fit of the straight line to the data is the straight line that minimizes this total squared distance from data to line. We find the minimum by differentiating with respect to both m and c and setting the derivatives to zero, which gives

$mE_{xx}+cE_{x}=E_{xy}$ and $mE_{x}+c=E_{y}$
where

$E_{x}=\frac{1}{N}\sum_{i=1}^{N} x_{i}$, $E_{y}=\frac{1}{N} \sum_{i=1}^{N} y_{i}$, $E_{xx}=\frac{1}{N} \sum_{i=1}^{N} x_{i}^{2}$, $E_{xy}=\frac{1}{N} \sum_{i=1}^{N} x_{i}y_{i}$.

Solving the equations simultaneously gives

$m = \frac{E_{xy}-E_{x}E_{y}}{E_{xx}-E_{x}^{2}}$ and $c=\frac{E_{xx}E_{y}-E_{x}E_{xy}}{E_{xx}-E_{x}^{2}}$.
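As a sanity check, these closed-form expressions for $m$ and $c$ can be compared against numpy.polyfit, which performs the same least-squares fit; the data below is synthetic, invented just for the comparison:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.size)  # noisy line: slope 2, intercept 1

# The means E_x, E_y, E_xx, E_xy from the text
Ex, Ey = x.mean(), y.mean()
Exx, Exy = (x**2).mean(), (x*y).mean()
m = (Exy - Ex*Ey) / (Exx - Ex**2)
c = (Exx*Ey - Ex*Exy) / (Exx - Ex**2)

# polyfit with degree 1 minimizes the same chi^2
m_np, c_np = np.polyfit(x, y, 1)
print(m, c)
```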

Download millikan.txt. The file contains two columns of numbers, giving the x and y coordinates of a set of data points.

In [59]:
data = loadtxt("millikan.txt", float)
x = data[:,0]
y = data[:,1]
N = len(x)
Ex = (1/N)*sum(x)
Ey = (1/N)*sum(y)
Exx = (1/N)*sum(x**2)
Exy = (1/N)*sum(x*y)

m = (Exy-Ex*Ey)/(Exx-Ex**2)
c = (Exx*Ey-Ex*Exy)/(Exx-Ex**2)

print(f"Slope = {m}, y-intercept = {round(c,3)}")

xfit = linspace(x[0],x[-1],100)
yfit = m*xfit+c
plt.figure()
plt.plot(xfit,yfit,'y-')
plt.plot(x,y,'k.')
plt.show()
Slope = 4.088227358517502e-15, y-intercept = -1.731

The data in the file millikan.txt represent frequencies $\nu$ in hertz (first column) and voltages $V$ in volts (second column) from photoelectric measurements of this kind; they come from the historic experiment by Robert Millikan that measured the photoelectric effect. When light of an appropriate wavelength is shone on the surface of a metal, the photons in the light can strike conduction electrons in the metal and sometimes eject them from the surface into the free space above. The energy of an ejected electron is equal to the energy of the photon that struck it minus a small amount $\phi$ called the work function of the surface, which represents the energy needed to remove an electron from the surface. The energy of a photon is $h\nu$, where $h$ is Planck's constant and $\nu$ is the frequency of the light, and we can measure the energy of an ejected electron by measuring the voltage $V$ that is just sufficient to stop the electron moving. The voltage, frequency, and work function are then related by the equation

$V = \frac{h}{e}\nu - \phi$.

In [63]:
e = 1.602e-19 # C
h = m*e
planck = 6.62607004e-34
error = 100*abs(planck-h)/abs(planck)
print(f"Planck's constant = {h} J*s")
print(f"The percent error is {round(error,2)}%")
Planck's constant = 6.549340228345038e-34 J*s
The percent error is 1.16%