Nerd Famous

It's nice to see we hams, who I think suffer from a bit of an image problem as throwbacks in the larger maker community, get some recognition for the good stuff we've accomplished. Today Hackaday posted a nice article about Manhattan and Ugly construction, with ample coverage given to the fact that many of the best exemplars of these techniques come from the world of amateur radio builders. I'm not certain how others feel on this topic, but Hackaday seems to me one of the preeminent blogs relating to our hobby, so I get quite excited when we get repped there.


Featured in this article are two names well-known in our circles, guys that I'm proud to call my friends (although I have yet to meet either in person!). Todd VE7BPO is renowned for his rigorous empirical work in circuit design, as well as his beautiful Ugly circuit creations. One of his designs is featured near the top of the article.


The other is Dave AA7EE, who is probably familiar to almost every reader, unless you have been living under a rock for the last decade. It's not difficult to see why they chose Dave's work to illustrate Manhattan construction, as his is some of the best out there. Period. Also unsurprisingly, this is not the first time that Dave's creations have made it to Hackaday.

Well done, gentlemen! Way to show the maker world at large that we've got relevant skills for the 21st century hacker community!

 

Wideband Transmission #8

Another 10 mW WSPR Beacon

I enjoy writing up my projects, but it's even better to get feedback showing that someone was actually able to take my write-up and successfully duplicate the project. Via the Etherkit Twitter account, I received this from Tom Hall, AK2B, regarding my last posted project:

Awesome work! Tom has been a great supporter of Etherkit from the beginning and I'd like to thank him for sharing his neat creations with the rest of us. It's wonderful to see such a minimalist design perform so well!

More Coding Resources for Fun

I haven't had a ton of free time here, but I do occasionally get snippets of time where I can sit with my notebook PC and mess around. As mentioned in some recent posts, I've been revisiting coding for fun, and I've stumbled upon quite a few resources that are new to me and that I thought would be good to share.

The first one I'd like to mention is called Scratchapixel. I was curious about the mathematical methods behind 3D rendering, and some searching brought me to this exhaustive tutorial site. It's not 100% complete yet, but most of the fundamentals of 3D graphics are already well explained there. A fantastic resource if, like me, you are curious about the first principles of 3D rendering.
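To give a flavor of those first principles (a minimal sketch of my own, not code taken from Scratchapixel), the heart of a simple pinhole-camera renderer is the perspective divide that maps a 3D point onto the image plane:

```python
def project(point, focal_length=1.0):
    """Project a camera-space 3D point (with z < 0, looking down -z) to 2D."""
    x, y, z = point
    # Similar triangles: the screen coordinate shrinks in proportion to depth
    return (focal_length * x / -z, focal_length * y / -z)

# A point twice as far from the camera lands at half the screen offset
print(project((1.0, 1.0, -1.0)))  # (1.0, 1.0)
print(project((1.0, 1.0, -2.0)))  # (0.5, 0.5)
```

Everything else in a renderer (rasterization, shading, texturing) is built on top of that one idea.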

A related site is called Shadertoy. It's not by the same people, but it also relates to learning 3D programming. Shadertoy is a web application that lets you play with GLSL shaders inside a web IDE that updates on-the-fly. It takes a bit of CPU and graphics horsepower to run comfortably, but if you've got the capacity, it's worth browsing the demos on the site just to see the cool stuff you can create with it. This tool was created by Íñigo Quílez, who also has a really cool home page with lots of tutorials and whitepapers. If you like demoscene stuff, then definitely check it out.

Another neat find that I only recently discovered goes by the name of Rosetta Code. It bills itself as a programming chrestomathy site, which basically means it's a place to learn how programming languages compare with one another. There is a large directory of programming tasks, and each task page shows solutions in a wide variety of languages. It's a wiki, so not every page covers every language, but there's obviously a ton of work put into the site, and most tasks have implementations in the major languages. Really fascinating and useful at the same time.

Finally, there's The Nature of Code. This site hosts a free e-book download of the content, and provides a link to purchase a dead tree version if you wish. Here's how the website describes the book:

How can we capture the unpredictable evolutionary and emergent properties of nature in software? How can understanding the mathematical principles behind our physical world help us to create digital worlds? This book focuses on the programming strategies and techniques behind computer simulations of natural systems using Processing.

That sounds right up my alley. I haven't read the book yet, but I have skimmed it a bit, and it looks like the kind of thing that I love: non-linear systems, physics simulations, fractals, and the like. When things settle down here a bit, I may tackle the book and rewrite the sample code in Python. That would give me some more Python practice and force me to really think about the algorithms behind the text, rather than blindly copying, pasting, and executing the scripts.
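As a taste of what that port might look like (a sketch of my own in the spirit of the book's opening chapter, not the book's actual Processing code), the book famously begins with a simple random walker:

```python
import random

class Walker:
    """A point that takes one random step per update, as in The Nature of Code."""

    def __init__(self):
        self.x, self.y = 0, 0

    def step(self):
        # Move one unit in one of the four cardinal directions
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        self.x += dx
        self.y += dy

walker = Walker()
for _ in range(1000):
    walker.step()
print(walker.x, walker.y)
```

In Processing the walker draws itself to a canvas every frame; in a Python port the drawing would go through something like matplotlib or pygame instead.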

Let me know in the comments if you found any of these links useful or fascinating, or better yet if you know of other links in the same vein.

New Miles-Per-Watt Record Opportunity?

If you regularly follow science news, you may have heard of the Breakthrough Starshot initiative. In short, this is a study to create pathfinding technology that would eventually allow a ground-based laser array to launch tiny-mass micro-lightsails toward the Alpha Centauri system at a significant velocity (0.2c!). It's a serious effort, privately funded to the tune of a whopping $100,000,000. No doubt, an extremely audacious undertaking.

Sounds interesting, but what does this have to do with radio? Well, obviously there's the issue of how you get a usable signal back to Earth across a distance of four and a half light years from a craft with a mass of tens of grams. I was wondering about that exact engineering challenge when I came across this article in my feed reader today. It turns out that someone has studied how one might use the Sun as a gravitational lens for lightwave communication across interstellar distances. Claudio Maccone, an Italian physicist, has run an analysis and determined that putting a receiver at a distance of at least 550 AU from Sol will give the desired lensing effect for optical communications.

Speaking before Maccone at the Breakthrough Discuss meeting, Slava Turyshev (Caltech) pointed out that the gain for optical radiation through a FOCAL mission is 10^11, a gain that oscillates but increases as you go further from the lens. This gives us the opportunity to consider multi-pixel imaging of exoplanets before we ever send missions to them.

That's kind of amazing. Maccone calculates that the bit error rate of optical communication at any significant distance from Sol quickly degrades to around 0.5. However, by using the Sun as a lens, the BER stays at 0 out to a distance of 9 LY. Here is a graph, as calculated by Maccone, comparing standard comms with comms enhanced by using the Sun as a gravitational lens:

[Graph: bit error rate vs. distance from Sol, with and without the solar gravitational lens]

What's really crazy is this next paragraph:

But as Maccone told the crowd at Stanford, we do much better still if we set up a bridge with not one but two FOCAL missions. Put one at the gravitational lens of the Sun, the other at the lens of the other star. At this point, things get wild. The minimum transmitted power drops to less than 10^-4 watts. You're reading that right: one-tenth of a milliwatt is enough to create error-free communications between the Sun and Alpha Centauri through two FOCAL antennas. Maccone's paper assumes two 12-meter FOCAL antennas. StarShot envisions using its somewhat smaller sail as the antenna, a goal given impetus by these numbers.

So that would have to rate as the ultimate QRP DX, eh? I'm not sure how realistic any of this is, but I'm pretty sure the physics are well-established by now. Kind of makes the Elser-Mathes Cup look like small potatoes.
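Out of curiosity, here's a quick sanity check on those numbers (my own back-of-envelope arithmetic, not taken from Maccone's paper):

```python
import math

def watts_to_dbm(p_watts):
    """Convert power in watts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(p_watts / 1e-3)

print(watts_to_dbm(1e-4))      # -10 dBm: the FOCAL bridge power
print(10 * math.log10(1e11))   # 110 dB: the quoted optical lens gain
print(watts_to_dbm(5))         # ~37 dBm: a typical 5 W QRP signal, for comparison
```

So the lens is doing the equivalent of 110 dB of antenna gain, which is how a signal 47 dB weaker than a 5 W QRP rig can cross 4.4 light years.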

 

New Monday

It's hard for me to believe, but it has been more than 30 years since the release of Blue Monday by New Order. I always loved that song as a kid, and it still holds up quite well in my opinion. (In related news: holy crap, I'm getting old.) Here, from the mysterious Orkestra Obsolete via the BBC, is a retro-future rendition of the electropop classic, featuring all manner of unique instruments available in the 1930s, including the theremin. There's some boatanchor electronics goodness in this video, and the music is entertaining as well. I quite enjoyed it.

For Noontime Net

I've been working on getting the little bugs out of the Si5351 SSB rig and making improvements to the circuit. Since SSB QRP operating can be a bit more challenging than CW QRP ops, Dave AA7EE suggested that I consider a speech processor IC in place of the op-amp microphone amplifier. He directed me to the Elecraft K2 schematic, which uses an Analog Devices SSM2166. I poked around the Analog website a bit and found a sister IC called the SSM2167. It's smaller, simpler, and cheaper than the SSM2166, which could make it perfect for this radio. I ordered a couple of samples of each from speakerxpert and they rush-shipped them here within a few days.

So today I got around to installing the SSM2167 in the 40 meter SSB radio, set the compression level to about 10 dB, and took a look at the transmitter waveform on my oscilloscope (I can still kind of see the screen if I get some light shining on it from the side). There is a single resistor which sets the compression level, and by jumpering around it, I can set the level to 0 dB. By comparing the waveforms with compression at ~10 dB and then off, I could tell that the average transmit power was increased quite a bit with compression on.
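For a feel of why compression raises average power even though peak power stays the same, here's a quick simulation (my own illustration, not a measurement from the rig):

```python
import numpy as np

# Model a "speechy" amplitude envelope as two beating sine waves
t = np.linspace(0.0, 1.0, 10000)
envelope = np.abs(np.sin(2 * np.pi * 3 * t) * np.sin(2 * np.pi * 7 * t))

def avg_to_peak_db(x):
    """Average power relative to peak power, in dB (0 dB = constant carrier)."""
    return 10 * np.log10(np.mean(x ** 2) / np.max(x ** 2))

# A crude soft compressor: extra gain driven into a tanh() soft limiter
compressed = np.tanh(3.0 * envelope)

print(f"average-to-peak ratio, no compression: {avg_to_peak_db(envelope):.1f} dB")
print(f"average-to-peak ratio, compressed:     {avg_to_peak_db(compressed):.1f} dB")
```

The compressor boosts the quiet parts of the envelope relative to the peaks, so the average-to-peak ratio improves, which is exactly what the scope showed on the transmit waveform.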

Next, I decided to check in to the Noontime Net to see how the rig would work on the air and hopefully get an audio report. As luck would have it, net control Leslie N7LOB was very strong here, so I knew I should have no trouble checking in today. I was also fortunate to have a strong signal from Lynn KV7L, the gentleman who donated the SA602s used in the radio. I've got a raw clip of my check-in below, which I hope to incorporate into a more polished video a bit later.

As you can tell from Lynn's comments, 10 dB of compression might be a bit much for something like checking into a net. I changed the resistor to set compression at around 6 dB, which should be more appropriate for this type of use. It also sounds like some folks on the Noontime Net want to see some photos of the rig, so here are a few taken with my tablet. Not the best quality, but it should give you an idea of what it looks like until I can get my "real" camera back and take better photos.


The Final Courtesy

The final chapter of the Hi Juno event has arrived at the NT7S shack. After months of waiting and wondering if my reception report was received, I returned home from a SOTA activation to find the beautiful Hi Juno QSL in my mailbox.

Thanks to the folks at JPL who coordinated this intriguing event and took the time to QSL all of us crazy hams who participated. The Hi Juno QSL will definitely sit as one of the specially treasured ones in my collection.

Here's to a successful mission!

We Make Contact

My last blog post (from two months ago, sorry about that) detailed my participation in the worldwide Hi Juno event: a coordinated effort by amateur radio operators around the world to send a very slow speed Morse code signal (HI, to be exact... DIT-DIT-DIT-DIT DIT-DIT) to the Juno spacecraft as it slingshotted around Earth on its way out to Jupiter.

After the attempt, the Juno science team promised an update to let us know how the experiment turned out, but they were very quiet over the following months. Worse was the news that Juno had tripped into safe mode during the Earth flyby, which meant there was a decent chance that no usable data from the experiment would be recovered. Then yesterday, 9 December 2013, a press release announced a presentation on the results of the Juno Earth flyby, including results from the Hi Juno experiment, and that the presentation would be streamed online. This morning at 10:30 AM, I eagerly connected to the livestream to see what they would announce.

Hi Juno Spectrogram

In short: we did it! As you can see in the spectrogram above, our signals were detected by the Juno spacecraft in a couple of different time slots. The green dits are the signals that were actually detected by Juno, while the gray ones are anticipated signals which were not detected. One thing about the spectrogram is slightly misleading: our actual received signals do not appear to be depicted in it, only the overlay. I'm not sure why that is, but I imagine it was done for clarity in public outreach. Still, as a ham, I would love to see the spectrogram without the overlay of the expected data. Also interesting are the streaky lines in the upper right-hand corner, which are said to be terrestrial shortwave broadcasters.

The Waves instrument principal investigator said that at least 1400 hams participated in the experiment (I assume that figure is based on the number of QSL requests sent to their email address). If you assume that each was running a barefoot commercial rig (I was, but had mine dialed back to 50 W just to go easy on the finals), it's not hard to imagine that collectively we put around 100 kW of 28 MHz RF out there for a few hours.
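That figure checks out with some quick arithmetic (the per-station average power is my assumption):

```python
hams = 1400

# Try a few plausible average power levels per barefoot station
for watts_each in (50, 70, 100):
    total_kw = hams * watts_each / 1000
    print(f"{watts_each} W per station -> {total_kw:.0f} kW total")
```

At an average of around 70 W per station, 1400 hams land right around 100 kW of aggregate RF.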

Perhaps this stuff is too obscure for the average person to care about, but in my view this is one of the most inspiring and amazing things I've done in amateur radio. You can see a bit of my raw reactions from Twitter below:

It's pretty rare for a space agency to reach out to the public at large for active participation in a spacecraft science experiment. The fact that we were able to pool together and successfully transmit a signal to a space probe whipping around the Earth at very high velocity just boggles my mind. I also have to give a huge huzzah to the team who created the public outreach website for Hi Juno. It was top-notch and did a perfect job of coordinating all of us hams around the world. I hope that the success of the Hi Juno experiment will encourage science teams to consider similar efforts in the future when possible.

It does seem that the Hi Juno experiment had quite an impact on the science team, as it inspired them to create a short documentary about the event and the results, which you can see below. It's very well produced and exciting to watch. There is also a shorter video which just shows a depiction of reception of the Hi Juno signal. Now I just need to wait for my Juno QSL to arrive...

UPDATE: Here's a press release about Hi Juno from the mission page.

Hi Juno After-Action Report

As I write this, the Juno spacecraft has completed its slingshot maneuver around Earth, having stolen a bit of Earth's orbital energy, and is now on its way out to Jupiter. A bit before the designated 1800 UTC start time for the event, I was able to set up my Icom IC-718 on the appointed frequency of 28.324 MHz with an output power of approximately 60 watts CW.

I executed the hijuno.py script via SSH (as mentioned in my last post) a few minutes shy of 1800, turned on my handheld scanner so I could monitor the transmit frequency, and waited for the show to start. I also checked a few WebSDR receivers to see if I could detect how many hams were participating in the Hi Juno event.

Hi Juno Website

The transmitter started up, but I could immediately see that it wasn't in sync with the other stations I could hear and see on the receiver. My shack PC is running Ubuntu 13.04 and is set up to automatically set its clock via NTP, but the clock was obviously off by quite a bit. So I had to duck into the shack quickly to force an NTP update, then come back to my laptop to restart hijuno.py via SSH. This time, by following along with the interactive Hi Juno website and listening to my transmit monitor, I could see that my timing was correct. As you can see above, the website had a nice graphical display of when to key up and key down for those doing this manually. The little yellow triangle at the bottom of the screen moved from left to right to indicate the current position within the transmit timing window.
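I haven't reproduced hijuno.py here, but the core idea, keying the rig in lockstep with an NTP-disciplined clock, can be sketched roughly like this (the key() stub and the window format are my own placeholders, not the actual script):

```python
import time

def key(down):
    # Placeholder: the real script would toggle the rig's key line (e.g. via serial)
    print("KEY DOWN" if down else "KEY UP")

def wait_until(epoch):
    """Sleep until the given Unix time, in short increments for accuracy."""
    while True:
        remaining = epoch - time.time()
        if remaining <= 0:
            return
        time.sleep(min(remaining, 0.05))

def run_schedule(start_epoch, windows):
    """Key down for each (offset_seconds, duration_seconds) window after start_epoch."""
    for offset, duration in windows:
        wait_until(start_epoch + offset)
        key(True)
        wait_until(start_epoch + offset + duration)
        key(False)

# Demo with very short windows; the real event used multi-second keying periods
run_schedule(time.time(), [(0.0, 0.05), (0.1, 0.05)])
```

Because everything is scheduled off the system clock rather than relative sleeps, the whole thing only works as well as your NTP sync, which is exactly the failure mode I hit.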

W5ZA WebSDR

At this point, satisfied that the Python script seemed to be working, I went back to WebSDR for a listen. The W5ZA 10 meter beacon receiver in Shreveport, Louisiana seemed to be a great choice for monitoring all the Hi Juno signals out there, probably because it was still daytime there. The European receivers, by contrast, seemed to be showing nothing. Normally that would be considered bad, but I have to think in this case it was a good thing: the ionosphere was probably not reflecting 10 meter signals back to Earth in that part of the world, so they were free to make it to Juno. Above, you can see a screen capture of the W5ZA WebSDR just after a Hi Juno keydown period.

The rest of the event was fairly... uneventful. The Python code worked perfectly and stopped transmitting at the right time. It was fun chatting on Twitter with other hams who were also participating in the event. Based on watching the WebSDR waterfall and checking Twitter search, it seemed like quite a few of us were taking part. I have no idea how long it will take to hear back from the investigators on whether this worked, but I hope it's fairly soon. I'm definitely looking forward to getting a QSL, my first from an interplanetary spacecraft. I also have to say that the Hi Juno website worked wonderfully during the event, with its simple, clear graphic instructing you when to transmit and showing the transmit window. If we ever get more opportunities in the ham community to participate in experiments like this, it should be a model for how to run things. Even though we didn't get any immediate gratification, it was a fun event, and I hope that NASA/JPL reaches out to us again in the future.

Fancier Minecraft Pi Game of Life

[pe2-image src="http://lh5.ggpht.com/-8gmHvJD2jyc/USH1uxHIu4I/AAAAAAAAC1I/ptVGYQdSylk/s144-c/IMG_20130218_012053.jpg" href="https://picasaweb.google.com/100175922620194527589/InstantUpload#5846223975109671810" caption="IMG_20130218_012053.jpg" type="image" alt="IMG_20130218_012053.jpg" ]

I spiffed up my last bit of Minecraft Pi Edition code by making the Game of Life fit into a smaller area of the world, making the world grid and live cells easier to see (dead cells are now Obsidian and live cells Diamond Blocks), and even adding a nifty little stepped wall around the playing field. In the two photos, you can see the new Game of Life from the ground inside the playing field and from hovering above it. It runs a fair amount faster now that it's only updating a 64x64 grid. It's still not going to break any speed records (even by 1980 standards), but it's a bit more fun to play with now.

[pe2-image src="http://lh3.ggpht.com/-h_Gc8m4Sl-Y/USH1qSqIu3I/AAAAAAAAC08/EO05SuWfSE4/s144-c/IMG_20130218_011949.jpg" href="https://picasaweb.google.com/100175922620194527589/InstantUpload#5846223898215496562" caption="IMG_20130218_011949.jpg" type="image" alt="IMG_20130218_011949.jpg" ]

 

# pilife.py
#
# Jason Milldrum
# 18 Feb 2013
#
# www.nt7s.com/blog

import minecraft.minecraft as minecraft
import minecraft.block as block
import numpy
import random

mc = minecraft.Minecraft.create()

# World size in x and z axes
#worldSize = 64

# Bounds of x and z axes
negLimit = -32
posLimit = 31

# Bounds of y axis
yNegLimit = -64
yPosLimit = 64

# Y coord of Life world floor
worldFloor = 0

# Number of steps in the surrounding wall
maxWallHeight = 5

# Initialize the Life world
theWorld = numpy.zeros((posLimit - negLimit + 1, posLimit - negLimit + 1), dtype=bool)
theNextWorld = numpy.zeros((posLimit - negLimit + 1, posLimit - negLimit + 1), dtype=bool)
for x in range(posLimit - negLimit + 1):	# +1 so the last row and column are seeded
	for y in range(posLimit - negLimit + 1):
		theWorld[x][y] = random.randint(0, 1)

# Clear everything at the world surface and above inside the Life play area
mc.setBlocks(negLimit - (maxWallHeight * 2), worldFloor, negLimit - (maxWallHeight * 2), posLimit + (maxWallHeight * 2) - 1, yPosLimit, posLimit + (maxWallHeight * 2) - 1, block.AIR)

# Let's create stairsteps around the Life world

# Start with the +x direction
x = posLimit
stepHeight = worldFloor
# Up
while stepHeight <= maxWallHeight:
	mc.setBlocks(x, worldFloor, negLimit - stepHeight - 1, x, stepHeight, posLimit + stepHeight, block.BEDROCK)
	x += 1
	stepHeight += 1
# Down
stepHeight = maxWallHeight
while stepHeight >= worldFloor:
	mc.setBlocks(x, worldFloor, negLimit - stepHeight - 1, x, stepHeight, posLimit + stepHeight, block.BEDROCK)
	x += 1
	stepHeight -= 1

# Now the -x direction
x = negLimit - 1
stepHeight = worldFloor
# Up
while stepHeight <= maxWallHeight:
	mc.setBlocks(x, worldFloor, negLimit - stepHeight - 1, x, stepHeight, posLimit + stepHeight, block.BEDROCK)
	x -= 1
	stepHeight += 1
# Down
stepHeight = maxWallHeight
while stepHeight >= worldFloor:
	mc.setBlocks(x, worldFloor, negLimit - stepHeight - 1, x, stepHeight, posLimit + stepHeight, block.BEDROCK)
	x -= 1
	stepHeight -= 1

# Next the +z direction
z = posLimit
stepHeight = worldFloor
# Up
while stepHeight <= maxWallHeight:
	mc.setBlocks(negLimit - stepHeight - 1, worldFloor, z, posLimit + stepHeight, stepHeight, z, block.BEDROCK)
	z += 1
	stepHeight += 1
# Down
stepHeight = maxWallHeight
while stepHeight >= worldFloor:
	mc.setBlocks(negLimit - stepHeight - 1, worldFloor, z, posLimit + stepHeight, stepHeight, z, block.BEDROCK)
	z += 1
	stepHeight -= 1

# Finally the -z direction
z = negLimit - 1
stepHeight = worldFloor
# Up
while stepHeight <= maxWallHeight:
	mc.setBlocks(negLimit - stepHeight - 1, worldFloor, z, posLimit + stepHeight, stepHeight, z, block.BEDROCK)
	z -= 1
	stepHeight += 1
# Down
stepHeight = maxWallHeight
while stepHeight >= worldFloor:
	mc.setBlocks(negLimit - stepHeight - 1, worldFloor, z, posLimit + stepHeight, stepHeight, z, block.BEDROCK)
	z -= 1
	stepHeight -= 1

# Set the player right in the middle of the world
mc.player.setPos(0, worldFloor, 0)

# Main processing loop
while True:
	# Display theWorld
	for x in range(posLimit - negLimit + 1):	# +1 so the last row and column are drawn
		for y in range(posLimit - negLimit + 1):
			if theWorld[x][y]:
				mc.setBlock(x + negLimit, worldFloor, y + negLimit, block.DIAMOND_BLOCK)
			else:
				mc.setBlock(x + negLimit, worldFloor, y + negLimit, block.OBSIDIAN)

	# Count the live neighbors of each cell, wrapping around at the edges
	for x in range(posLimit - negLimit + 1):
		for y in range(posLimit - negLimit + 1):
			
			if x == 0:
				xMinus = posLimit - negLimit
			else:
				xMinus = x - 1

			if x == posLimit - negLimit:
				xPlus = 0
			else:
				xPlus = x + 1

			if y == 0:
				yMinus = posLimit - negLimit
			else:
				yMinus = y - 1

			if y == posLimit - negLimit:
				yPlus = 0
			else:
				yPlus = y + 1

			alive = 0			

			if theWorld[xPlus][yPlus]:
				alive += 1
			if theWorld[x][yPlus]:
				alive += 1
			if theWorld[xMinus][yPlus]:
				alive += 1
			if theWorld[xPlus][y]:
				alive += 1
			if theWorld[xMinus][y]:
				alive += 1
			if theWorld[xPlus][yMinus]:
				alive += 1
			if theWorld[x][yMinus]:
				alive += 1
			if theWorld[xMinus][yMinus]:
				alive += 1

			# Calculate which cells live and die in the next generation
			if not theWorld[x][y]:
				# Dead cell: born only with exactly 3 live neighbors
				# (set explicitly, or stale values linger in theNextWorld)
				theNextWorld[x][y] = (alive == 3)
			else:
				# Live cell: survives with 2 or 3 live neighbors
				theNextWorld[x][y] = (alive == 2 or alive == 3)
	# Copy array
	theWorld = theNextWorld.copy()

Conway's Game of Life in Minecraft Pi Edition

Also known as "The slowest implementation of the Game of Life in 2013". This is what happens when you have insomnia.

[pe2-image src="http://lh3.ggpht.com/-pIMUUJNK__k/URzHojTQSAI/AAAAAAAACxQ/MCZeLIlpebE/s144-c/IMG_20130214_031548.jpg" href="https://picasaweb.google.com/100175922620194527589/InstantUpload#5844765915904755714" caption="IMG_20130214_031548.jpg" type="image" alt="IMG_20130214_031548.jpg" ]

What I did was first clear out the entire world, then place a plane of glass across the entire world at y=1. The actual Life cells are Cobblestone blocks on the y=0 plane (the grid is on the Minecraft x-z axis). The Life grid is initialized with a random seed, then set off to work. This code for the Game of Life is about the dumbest and slowest implementation there is. I've done no optimization at this point. It only updates about one generation (over the entire world) every few minutes. But it does seem to work, as you can see above.

Next time I need a break from electronics, I'll refine the code and post it again (or you can follow the Gist). It's way too slow to run the entire world as a Life simulation, so I think I'll just clear out a 64x64 space in the middle of the world and confine the simulation to that area, which I hope should make things run about an order of magnitude faster. I know this is crap code, but I'm still trying to get into the Python frame of mind, and this was a quick hack anyway.
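One candidate for that speedup (an untested idea of mine, not code from the Gist): count every cell's neighbors at once with NumPy roll operations instead of looping cell by cell, then only touch the Minecraft blocks that actually changed.

```python
import numpy as np

def life_step(world):
    """One Game of Life generation on a 2D bool array, with wraparound edges."""
    # Sum the eight shifted copies of the grid to get per-cell neighbor counts
    neighbors = sum(
        np.roll(np.roll(world, dx, axis=0), dy, axis=1)
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3
    return (neighbors == 3) | (world & (neighbors == 2))

# Sanity check: a blinker oscillates with period 2
blinker = np.zeros((5, 5), dtype=bool)
blinker[2, 1:4] = True
print(np.array_equal(life_step(life_step(blinker)), blinker))  # True
```

The generation update itself becomes nearly instant that way; the remaining bottleneck would be the setBlock calls, so diffing old against new and updating only the changed cells should help a lot too.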

I'll let this thing run for a while and post a screenshot of the evolved world to Twitter and G+ later on. Also, thanks for the shoutout from the new http://mcpipy.wordpress.com/ blog!