Wideband Transmission #8

Another 10 mW WSPR Beacon

I enjoy writing up my projects, but it's much better to get feedback to see that someone was actually able to take my writing and successfully duplicate my project. Via the Etherkit Twitter account, I received this from Tom Hall, AK2B regarding my last posted project:

Awesome work! Tom has been a great supporter of Etherkit from the beginning and I'd like to thank him for sharing his neat creations with the rest of us. It's wonderful to see such a minimalist design perform so well!

More Coding Resources for Fun

I haven't had a ton of free time here, but I do occasionally get snippets of time where I can sit with my notebook PC for a bit and mess around. As mentioned in some recent posts, I've been revisiting coding for fun, and I've stumbled upon quite a few resources that are new to me and that I thought would be good to share.

The first one I'd like to mention is called Scratchapixel. I was curious about the mathematical methods behind 3D rendering, and some searching brought me to this exhaustive tutorial site. It's not 100% complete yet, but most of the fundamentals of 3D graphics are already well-explained there. A fantastic resource if, like me, you are curious about the first principles of 3D rendering.

A related site is called Shadertoy. It's not by the same people, but it also touches on learning 3D programming. Shadertoy is a web application that lets you play with GLSL shaders inside a web IDE that can be updated on-the-fly. It takes a bit of CPU and graphics horsepower to run comfortably, but if you've got the capacity, it's worth browsing the demos on the site just to see the cool stuff you can create with it. This tool was created by Íñigo Quílez, who also has a really cool home page with lots of tutorials and whitepapers. If you like demoscene stuff, then definitely check it out.

Another neat find that I only recently discovered goes by the name of Rosetta Code. It bills itself as a programming chrestomathy site, which basically means that it's a site for learning programming languages comparatively, by seeing how they relate to one another. There is a large directory of programming tasks, and each task page lists ways to implement a solution in a wide variety of languages. It's a wiki, so not every page has every language implementation, but there's obviously a ton of work put into the site, and most tasks have implementations in the major languages. Really fascinating and useful at the same time.

Finally, there's The Nature of Code. This site hosts a free e-book download of the content, and provides a link to purchase a dead tree version if you wish. Here's how the website describes the book:

How can we capture the unpredictable evolutionary and emergent properties of nature in software? How can understanding the mathematical principles behind our physical world help us to create digital worlds? This book focuses on the programming strategies and techniques behind computer simulations of natural systems using Processing.

That sounds right up my alley. I haven't read the book yet, but I have skimmed it a bit, and it looks like the kind of thing that I love: non-linear systems, physics simulations, fractals, and the like. When things settle down here a bit, I may tackle the book and rewrite the sample code in Python. That would give me some more Python practice and force me to really think about the algorithms behind the text instead of just blindly copying, pasting, and executing the scripts.

Let me know in the comments if you found any of these links useful or fascinating, or better yet if you know of other links in the same vein.

New Miles-Per-Watt Record Opportunity?

If you regularly follow science news, you may have heard of the Breakthrough Starshot initiative. In short, this is a study to create the pathfinding technology that would allow the eventual launch of tiny-mass micro-lightsails to the Alpha Centauri system at a significant velocity (0.2c!) via a ground-based laser array. It's clearly a serious effort, as it is being privately funded to the tune of a whopping $100,000,000. No doubt, an extremely audacious undertaking.

Sounds interesting, but what does this have to do with radio? Well, obviously there's the issue of how you can get a usable signal back to Earth across a distance of 4-and-a-half lightyears from a craft that masses in the tens of grams. I was wondering about that exact engineering challenge when I came across this article in my feed reader today. It turns out that someone has studied how one might use the Sun as a gravitational lens for lightwave communication across interstellar distances. Claudio Maccone, an Italian physicist, has run an analysis and determined that putting a receiver at a distance of at least 550 AU from Sol will give the desired lensing effect for optical communications.

Speaking before Maccone at the Breakthrough Discuss meeting, Slava Turyshev (Caltech) pointed out that the gain for optical radiation through a FOCAL mission is 10^11, a gain that oscillates but increases as you go further from the lens. This gives us the opportunity to consider multi-pixel imaging of exoplanets before we ever send missions to them.

That's kind of amazing. Maccone calculates that the bit error rate of optical communication at any significant distance from Sol quickly degrades to around 0.5. However, by using the Sun as a lens, the BER stays at 0 out to a distance of 9 LY. Here is a graph comparing standard comms with comms enhanced by using the Sun as a gravitational lens, as calculated by Maccone:

[Figure: BER vs. distance for direct optical comms and for comms using the Sun as a gravitational lens (Maccone)]

What's really crazy is this next paragraph:

But as Maccone told the crowd at Stanford, we do much better still if we set up a bridge with not one but two FOCAL missions. Put one at the gravitational lens of the Sun, the other at the lens of the other star. At this point, things get wild. The minimum transmitted power drops to less than 10^-4 watts. You’re reading that right — one-tenth of a milliwatt is enough to create error-free communications between the Sun and Alpha Centauri through two FOCAL antennas. Maccone’s paper assumes two 12-meter FOCAL antennas. StarShot envisions using its somewhat smaller sail as the antenna, a goal given impetus by these numbers.

So that would have to rate as the ultimate QRP DX, eh? I'm not sure how realistic any of this is, but I'm pretty sure the physics are well-established by now. Kind of makes the Elser-Mathes Cup look like small potatoes.


More Strange Attractors

As I mentioned toward the end of my post on the Python/GTK+ implementation of the Lorenz attractor, I ultimately wanted to add some of the other strange attractors to my program in order to make it a bit more interesting. So I did just that. It also gave me a good excuse to learn a bit more about how to use and lay out widgets in a GTK+ application.

In addition to the Lorenz attractor, the program will now generate the Rössler Attractor and a plot of the behavior of Chua's Circuit. The Rössler Attractor is another well-known system in the world of chaos theory, which you can learn more about by following the above link. Since I'm only plotting a 2D section of each attractor, I have to decide which view to display. In the case of the Rössler, I thought that the X-Y view was better than the X-Z view.

More interesting to me is the simulation of Chua's Circuit, as this is based on an actual analog circuit you can build. The circuit is a chaotic oscillator that consists of the usual L-C elements (and a resistor plus limiting diodes) along with a nonlinear negative resistance circuit element. The negative resistance element is usually implemented with an active device such as an op-amp, although it has been reported that a memristor can also serve this function. The simulation is a system of three ordinary differential equations, much like the Lorenz or Rössler systems, but with a function in the first ODE to represent the behavior of the nonlinear negative resistance element. You can see in the code listing below that this was easy to implement in Python with a lambda function. It's cool to see the pattern drawn on a display, but I think it would be much better to have an actual circuit render it on an analog oscilloscope. One day, I hope to do that, but in the meantime, enjoy these videos of the behavior of such a circuit.
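The core of the simulation looks something like this minimal sketch (simple Euler integration using the commonly published dimensionless parameters; the full program adds the GTK+ plotting on top, and my exact values may differ):

```python
# Dimensionless Chua's circuit with commonly published parameter values
alpha, beta_c = 15.6, 28.0
m0, m1 = -1.143, -0.714

# The nonlinear negative resistance element as a lambda (piecewise linear)
f = lambda x: m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))

def chua_points(n, h=0.005, x=0.7, y=0.0, z=0.0):
    """Integrate the three ODEs with Euler steps; return (x, y) pairs
    for a 2D view of the double-scroll attractor."""
    pts = []
    for _ in range(n):
        dx = alpha * (y - x - f(x))
        dy = x - y + z
        dz = -beta_c * y
        x, y, z = x + h * dx, y + h * dy, z + h * dz
        pts.append((x, y))
    return pts
```

The lambda is all it takes to capture the piecewise-linear element; everything else is the same integrate-and-plot loop as the other two attractors.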

As far as my additions to the Python code, I created a GTK DrawingArea for each attractor, then added them to a Stack, which allows them to be switched with the StackSwitcher widget at the bottom of the screen. For clarity, I also added a legend to each DrawingArea to display which axes are being rendered for each attractor, as the Rössler has a different view from the other two. This code is a bit longer than the initial iteration, but much of it is similar, since the calculation and plotting of each system is nearly the same (yes, I could have factored the code quite a bit, but this is just for fun). Another fun time with code was had!

Lorenz Attractor in Python

Back in the day, when I was a budding nerd in the late 80s/early 90s, I spent a lot of my free time down at the local public library looking for any books I could find regarding certain topics which captured my interest: programming, astronomy, electronics, radio, mathematics (especially the recreational mathematics books), and other topics in the realm of science.

One of my fondest recollections of that time is how accessible home investigations into some of these topics became due to the advent of the personal computer. Of course, in those times, not every household had some kind of computing device like they do today, but PCs were affordable enough that even a lower-middle-class house like ours could scrape together enough for a computer with a bit of work.

We also didn't have widespread household Internet, so your options for getting new programs to play with were limited to the relatively expensive services like CompuServe, trying to find warez on bulletin board systems, checking out books and magazines with program listings from the public library, or perhaps if you were lucky, being able to check out a stack of 3.5 in floppies from the library (how were they able to do that with commercial software?). Of course, given the previously mentioned socio-economic status, I was mostly limited to the public library option. Although perhaps some day I'll tell the tale of how I talked my parents into letting me get CompuServe, then proceeded to rack up a huge access bill on my parents' credit card. Oops.

Of particular interest to me was the relatively new field of chaos theory and fractals. These studies were conceived a bit earlier than this time period, but were popularized in the public eye during this time. I got hooked in by the photos published in the groundbreaking books The Fractal Geometry of Nature, Chaos: Making a New Science, and The Beauty of Fractals. There were also plenty of articles in publications like Scientific American and the books of authors like Clifford Pickover. The wonderful thing about these resources is that many of them not only showed you the pretty photos of fractals and chaotic systems, but actually described and illustrated the algorithms behind them, which allowed you to code your own implementations at home.

During my early work in trying to recreate these forms on my own computer, I had an Atari 800XL with a dodgy floppy disk drive, which made any kind of coding a bit of an adventure, as it seemed like my mass storage only worked successfully about half of the time (on a side note, who remembers the even earlier days, where you would type in a machine language monitor program in BASIC, and then transcribe strings of ML from a listing in a magazine in order to get a new program...good times). Things really got serious when we ended up acquiring a Tandy 1000 (I believe the RL version, but I'm not 100% certain about that). Once we had that in the house, I spent many late weekend nights trying to write code to reproduce the fascinating patterns found in the pages of those books. You know you're a true nerd when you get such an electric thrill from finally mastering the code to generate a Sierpinski triangle or Barnsley fern in glorious CGA on your own monitor.

So what's the point of this overwrought bout of nostalgia? Well, I was recently pining for the old days when you could just sit down at the PC and implement a chaos system in one quick sitting with minimal fuss. Compiled languages with arcane GUI frameworks were right out. Fortunately, we are blessed with quite a few good replacements for the old BASIC environment. My favorite is Python, and since I use Linux Mint as my primary desktop OS, the GTK+ 3 libraries for Python are already included by default, so it's quite easy to get a rudimentary 2D graphics system up and running quickly.

For my first chaos system coding challenge, I decided to go with the great-granddaddy of chaos: the Lorenz attractor. It's an easy system to compute, and it's quite obvious if you get the implementation right or wrong. Once I got the hang of the GTK+ 3 library interface, it didn't take that long to bang out an implementation of the Lorenz attractor in relatively few lines of Python. The simplicity is satisfying, and reminds me of the fabled old days of coding.
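The mathematical core is tiny. Here's a minimal sketch of the integration (simple Euler steps with the classic σ = 10, ρ = 28, β = 8/3 parameters; the actual program wraps this in the GTK+ drawing code, and the step size may differ):

```python
# Classic Lorenz parameters
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz_points(n, h=0.005, x=0.1, y=0.0, z=0.0):
    """Integrate the Lorenz system with simple Euler steps and return
    n (x, z) pairs -- the familiar 'butterfly' projection."""
    pts = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + h * dx, y + h * dy, z + h * dz
        pts.append((x, z))
    return pts
```

Scale the (x, z) pairs to your DrawingArea dimensions and plot each one as a point, and the butterfly appears.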

There's the code if any of you would like to play around with it. It should be fairly easy to replicate if you are using any of the Debian-derived Linux distributions, and probably only marginally more difficult with other Linux distros. I have no idea what it would take to get running on Windows, so good luck if that's your OS of choice.

Now I have a framework to build off if I get a further itch for similar experiments. I'm already working on an extension to this code that will render other attractors. It would be fun to find a 3D rendering library that would be easy to use so that I could plot in three dimensions, but that's not hugely critical to me. This is just an exercise to have some fun, capture a bit of that nostalgia, and distract myself a bit during downtime. Hopefully a few of you kindred souls will have derived some enjoyment from this trip down memory lane.

Si5351A Investigations Part 4

I've got a lot of project ideas rattling around my head. Got even more of them in my notebook. One of the projects floating around in there for years that seemed pretty cool, but not urgent, was an automated thermal chamber for oscillator stability testing, roughly based on the one seen at the end of chapter 7 of Experimental Methods in RF Design.

My interest was renewed a few months ago when Jennifer brought home some free Styrofoam containers from the vet's office from Baxter's annual checkup. They were nice and thick, as well as having straight interior walls, unlike the typical cheap beer cooler you find at the supermarket. Of course, Jennifer was thinking of using them for food, but naturally I had more nefarious purposes in mind for one of them.

The real impetus to build the thermal chamber was the realization that it would be extremely helpful in characterizing the behavior of the Si5351A. Taking mental stock of what I had on hand, I realized that I already had almost everything I would need to do the job. So this last weekend, I decided to build the chamber and put it to use characterizing the Si5351A. In these tests, I used the Si5351A with a cheap garden-variety crystal: the ECS-250-8-30B-CKM.

I don't intend to do a full write-up of the thermal chamber here (that will be coming in a separate post), but I will briefly cover the basic design. The physical chamber is that veterinary Styrofoam cooler, with rough dimensions of 25 x 30 x 30 cm. The device under test (DUT) sits in the bottom of the chamber. Over the top of that sits a shield constructed from pressboard, with legs made from 2x2s that raise the shield to a height of about 9 cm from the floor of the container and keep the DUT from receiving much direct radiant heat. A 12 V PC cooling fan is mounted over a hole in the pressboard. There are also three other large holes on the outside edge of the pressboard, which allow the fan to circulate air between the upper and lower partitions of the chamber. The heating element is a 60 watt incandescent light bulb. A porcelain light socket is secured to the lid with a large cable tie, allowing the light to drop down into the top of the chamber. The weight of the porcelain socket also helps to hold the lid down securely onto the container.

[Photo: the completed thermal chamber]

On the electronics side, I chose an Arduino Uno clone (the Sparkfun Redboard) as the controller platform. Fortuitously, a little while ago I also happened to win a Seeed Studio Relay Shield, which worked perfectly in this application for switching the lightbulb and the fan. There was also a DS1821 One-Wire temperature sensor in my junkbox, which interfaced with the Redboard with a little bit of code I found online. An old two-conductor power cord and an inline ATC fuse were used to provide power to the light bulb. Simple firmware was written for the Redboard that allows it to be commanded via the serial connection. The light and fan can be switched on or off via a single character command sent over the serial connection. Likewise, the temperature can be queried and sent to the PC via the serial connection.

The shack PC ties everything together via a Python script. My Rigol DSA815-TG spectrum analyzer is used as the frequency counter, since I can control it remotely via USB with the Python usbtmc library. My control script reads the temperature and frequency on an interval (I've been using 15 or 30 second intervals) and has logic to control the light and fan based on the readings. I will post the code to GitHub when I write a full post about the chamber.

Now that you know how the system works, let's look at what I found with the Si5351A as the DUT. After doing some initial tweaking of the system to get it working the way I expected, on my first true run, I set up a simple temperature profile. After a 4 minute idle period, the light and fan were commanded to turn on until the temperature reached 60C, then the light was turned off, and I cracked open the lid of the chamber a bit so that it could cool off relatively quickly. The most noticeable thing is the double-humped response in the frequency. You can see the typical frequency reduction as temperature increases, but then around 52C, the trend reverses! I'm not quite sure what to make of it. But I must say that a total frequency excursion of about 70 Hz over 35 degrees of temperature change looks pretty nice to me.

[Plot: first run, frequency and temperature vs. time with a ramp to 60C]

Next, Thomas LA3PNA suggested in the Etherkit IRC channel that I do a long run with no extra heating so I can get an idea of the warm-up drift and the long term stability of the oscillator in a temperature stable environment. That's a very useful thing to know, so I reconfigured the Python control program to do that. As you can see from the plot below, after a small amount of drift in the first 10 minutes or so, the Si5351A is extremely stable. Those excursions that you see from the main trend line are only 1 Hz, so those may be due to the oscillator or to error in the frequency measurement, but either way I'd say that's rock-solid. You can even notice that the temperature of the chamber was a bit high from the previous run and settled down slowly to ambient, but that had no noticeable effect on the stability.

[Plot: warm-up drift and long-term stability with no external heating]

After that, the Python program was rewritten to ramp up temperature to 40C, then try to hold it there by toggling the light on and off if the temperature deviated more than +/- 1 degree. I wasn't completely happy with the control loop in this one (I used 30 second intervals, but it should have been shorter) but the graph was still instructive. This time, the frequency response looks about how one would expect with this type of temperature profile.

[Plot: frequency and temperature vs. time with a 40C hold]
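The toggling logic itself is just bang-bang control with hysteresis. A hypothetical helper to illustrate it (the names and structure here are mine for illustration, not the actual control script, which also handles the measurement logging):

```python
def light_command(temp_c, setpoint=40.0, band=1.0, light_on=False):
    """Decide the heat lamp state for the next measurement interval.
    Toggle only when the temperature leaves the +/- band around the
    setpoint; otherwise keep the current state (hysteresis)."""
    if temp_c > setpoint + band:
        return False       # too hot: lamp off
    if temp_c < setpoint - band:
        return True        # too cool: lamp on
    return light_on        # within the band: leave the lamp alone
```

The hysteresis band keeps the relay from chattering near the setpoint; shrinking the band and the measurement interval tightens the hold, which is exactly the tweak made for the later runs.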

Finally, I again tweaked the control algorithm in order to tighten up the measurement interval to 20 seconds and the maximum temperature excursion to +/- 0.5 degrees. The hold temperature was set for 50C, which is close to where that odd inflection in the drift appears. You can see that the control at 50C is much better here. You will also notice that blip where the positive temperature coefficient appears to go negative. Still, it holds relatively stable at 50C.

[Plot: frequency and temperature vs. time with a 50C hold]

There is not a lot of data out on the internet to compare against. However, I believe it's fair to say that the Si5351A looks pretty solid from a stability standpoint. It's doubtful that one will be operating a radio under such extreme temperature excursions in any case, but even so, <100 Hz of drift seems tolerable for almost any application where one is using a conversational operating mode. Of course, there is still more data that could be collected, such as performance at low temperatures, but from this initial investigation, I would say that things look very promising for the Si5351A.

11 January 2015 Update

I finally received a small supply of the TCXO that I have been planning on using with the Si5351A for a while now: the FOX924B-25.000. In the interest of comparing the performance of the TCXO against the standard crystal, I ran the same thermal chamber temperature profile as the last one above, although I removed the lid at the end of the 50C cycle to get a steeper cooldown gradient.

[Plot: TCXO run with a 50C hold]

As you can see, the TCXO stability is approximately an order of magnitude better than the crystal. The maximum frequency deviation is 9 Hz, although that occurs at the point where the lamp is turned off, so the frequency response is somewhat like the first derivative of the temperature curve. Once the temperature stabilizes near 50C, the control loop does a great job of holding the frequency only a few Hz higher than the room temperature frequency. This TCXO should be suitable for all but the most demanding applications. Certainly it would be excellent for WSPR/QRSS work and for portable outdoor ops like SOTA.

Hi Juno After-Action Report

As I write this, the Juno spacecraft has completed its slingshot maneuver around Earth, having stolen a bit of Earth's orbital energy, and is now on its way out to Jupiter. A bit before the designated 1800 UTC start time for the event, I was able to set up my Icom IC-718 on the appointed frequency of 28.324 MHz with an output power of approximately 60 watts CW.

I executed the hijuno.py script via SSH (as mentioned in my last post) a few minutes shy of 1800, turned on my handheld scanner so I could monitor the transmit frequency, and waited for the show to start. I also checked a few WebSDR receivers to see if I could detect how many hams were participating in the Hi Juno event.

Hi Juno Website

The transmitter started up, but I could immediately see that it wasn't in sync with the other stations that I could hear and see on the receiver. My shack PC is running Ubuntu 13.04 and is set up to automatically set its clock via NTP, but obviously the clock was off by quite a bit. So I had to duck into the shack quickly to manually force an NTP update, then come back to my laptop to restart hijuno.py via SSH. This time, I could see by following along with the interactive Hi Juno website and listening to my transmit monitor that my timing was correct. As you can see above, the website had a nice graphical display of when to key up and key down for those doing this manually. That little yellow triangle at the bottom of the screen moved from left to right to indicate the current position within the transmit timing window.

W5ZA WebSDR

At this point, satisfied that the Python script seemed to be working, I went back to WebSDR for a listen. The W5ZA 10 meter beacon receiver in Shreveport, Louisiana seemed to be a great choice for monitoring all the Hi Juno signals out there, probably because it was still in daytime, as opposed to the European receivers, which seemed to be showing nothing. Normally this would be considered bad, but I have to think in this case it was a good thing, since the ionosphere was probably not reflecting 10 meter signals back to Earth in this part of the world, and they were free to make it to Juno. To the left, you can see a screen capture of the W5ZA WebSDR just after a Hi Juno keydown period.

The rest of the event was fairly...uneventful. The Python code worked perfectly and stopped transmitting at the right time. It was fun chatting on Twitter with other hams who were also participating in the event. Based on watching the WebSDR waterfall and checking Twitter search, it seemed like there were quite a few of us taking part in the event. I have no idea how long it will take for us to hear back from the investigators on whether this worked, but I hope it's fairly soon. I'm definitely looking forward to getting a QSL; my first one from an interplanetary spacecraft. I also have to say that the Hi Juno website worked wonderfully during the event, with its simple and clear graphic instructing you when to transmit and showing you the transmit window. If we ever get more opportunities to participate in experiments like this in the ham community, it should be a model for how to run things. Even though we didn't get any immediate gratification, it was a fun event and I hope that NASA/JPL reaches out to us again in the future.

Hi Juno!

As a world-class procrastinator, I know I'm very late in posting this, only about 12 hours before the event. However, I still wanted to share it with you in the hopes that maybe it could help one person.

As you may have heard, the Juno spacecraft will be making a close approach to Earth on 9 October 2013 as it slingshots to gain energy for the trip to Jupiter. The investigators who are in charge of a radio receiver on the spacecraft wish to see if they can detect intelligent life on Earth who may be transmitting on the 10 meter band. Therefore, they are asking licensed radio amateurs to transmit a slow-speed CW "HI" signal to Juno during a window at Juno's closest approach. The full details are on the Hi Juno page (due to the US government shutdown, the primary page is offline, but the event is still planned to take place).

In order to be able to take part in this event without having to be right at the transmitter (I have to take care of my two toddler boys during the specified time period), I wrote a program in Python which will automatically transmit at the appropriate time. You just need a PC synchronized to NTP time, a 10 meter CW transmitter, a serial port, and a keying interface (which I will describe shortly). I plan to execute the program on my shack PC via SSH and monitor my transmissions on a portable receiver to maintain control of the transmitter.

Serial Port Keying Circuit
Serial Port Keying Circuit

Here is the simple keying circuit I use to key my Icom IC-718. It should work with just about any grounded keying transmitter, but as usual your mileage may vary. I use a DB9 female jack for the serial port. The RTS line is used to turn on a 2N7000 MOSFET, which will ground the key line in order to transmit. You can use any key jack that is appropriate for your transmitter. I use this circuit with a USB-to-Serial adapter, and it still seems to work fine.

The actual Python program to control the serial port keyer is found here at GitHub. You will need to have the PySerial module installed on your system, in addition to the regular Python installation. I've tested it here, but please be sure to test it yourself on a dummy load before using it on the air (you will need to temporarily change the START_DATE variable to an earlier time in order to get the program to transmit). You will also need to change the DEVICE, BAUD, and CALLSIGN variables to values appropriate for you. Linux/OS X users would change DEVICE to whichever "/dev/tty*" port is appropriate, where the * is your port number. Windows users would use "COM*", where * is the COM port number. Sorry that I can't hold your hand through this, but it should be fairly simple to get running. Linux and OS X users may also have to execute the program under sudo in order to access the serial port.
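To give a flavor of how the keying works, here's a minimal sketch (this is not the actual hijuno.py, which also handles the event's start time and timing windows; the helper name and trimmed Morse table are mine for illustration):

```python
# A sketch of the keying approach: build an on/off schedule for the
# message, then drive the serial port's RTS line from it.
MORSE = {"H": "....", "I": ".."}   # table trimmed to the event's message

def keying_schedule(text, dit=1.0):
    """Return (key_down, seconds) pairs for the given text.
    A dah is 3 dits; inter-element gap is 1 dit; inter-letter gap is 3 dits."""
    events = []
    for i, letter in enumerate(text.upper()):
        for j, element in enumerate(MORSE[letter]):
            if j:
                events.append((False, dit))            # gap between elements
            events.append((True, dit if element == "." else 3 * dit))
        if i < len(text) - 1:
            events.append((False, 3 * dit))            # gap between letters
    return events

# Keying the rig through the RTS line (PySerial 3.x syntax):
# import time, serial
# ser = serial.Serial("/dev/ttyUSB0")
# for down, secs in keying_schedule("HI", dit=30.0):
#     ser.rts = down     # RTS drives the 2N7000 gate in the keying circuit
#     time.sleep(secs)
```

With a 30-second dit, "HI" takes several minutes to send, which is why letting a script handle the timing beats sitting at the key.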

Please let me know if you end up using this, and don't forget to request a QSL from Juno!

Conway's Game of Life in Minecraft Pi Edition

Also known as "The slowest implementation of the Game of Life in 2013". This is what happens when you have insomnia.

[pe2-image src="http://lh3.ggpht.com/-pIMUUJNK__k/URzHojTQSAI/AAAAAAAACxQ/MCZeLIlpebE/s144-c/IMG_20130214_031548.jpg" href="https://picasaweb.google.com/100175922620194527589/InstantUpload#5844765915904755714" caption="IMG_20130214_031548.jpg" type="image" alt="IMG_20130214_031548.jpg" ]

What I did was first clear out the entire world, then place a plane of glass across the entire world at y=1. The actual Life cells are Cobblestone blocks on the y=0 plane (the grid is on the Minecraft x-z axis). The Life grid is initialized with a random seed, then set off to work. This code for the Game of Life is about the dumbest and slowest implementation there is. I've done no optimization at this point. It only updates about one generation (over the entire world) every few minutes. But it does seem to work, as you can see above.
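For the curious, the naive update step looks something like this (a sketch; the real script also issues one Minecraft setBlock call per changed cell, Cobblestone for a birth and Air for a death, which is where all the time goes):

```python
def life_step(cells, width, height):
    """Advance Conway's Life one generation on a bounded width x height
    grid. `cells` is the set of live (x, z) coordinates."""
    new = set()
    for x in range(width):
        for z in range(height):
            # Count live neighbors among the 8 surrounding cells
            n = sum((nx, nz) in cells
                    for nx in (x - 1, x, x + 1)
                    for nz in (z - 1, z, z + 1)
                    if (nx, nz) != (x, z))
            # Birth on exactly 3 neighbors; survival on 2 or 3
            if n == 3 or (n == 2 and (x, z) in cells):
                new.add((x, z))
    return new
```

Scanning every cell of the whole world each generation is the dumb part; only visiting live cells and their neighbors would be the obvious first optimization.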

Next time I need a break from electronics, I'll refine the code and post it again (or you can follow the Gist). It's way too slow to run the entire world as a Life simulation, so I think I'll just clear out a 64x64 space in the middle of the world and confine the simulation to that size, which should make things run about an order of magnitude faster, I would hope. I know, this is crap code, but I'm still trying to really get into the Python frame of mind and this was a quick hack anyway.

I'll let this thing run for a while and post a screenshot of the evolved world to Twitter and G+ later on. Also, thanks to the shoutout from the new http://mcpipy.wordpress.com/ blog!

Exploring Minecraft Pi Edition

If you are a Raspberry Pi enthusiast, you may have seen that Minecraft Pi Edition was officially released yesterday. I don't have the time to game like I used to, so I haven't really played Minecraft, but this version looked intriguing since it's free and it has an open API. So I downloaded it yesterday during a break when both of the boys were napping and gave it a quick run. The performance of the game is surprisingly responsive, which shows that the GPU in the Pi is fairly capable, even if stock Raspbian X Windows is slow.

With a bit of digging into the very sparse API docs included with the program, and a little Internet help, I was able to get a bit of code up and running. All it does is create a sphere 10 blocks away from the player's location in the Z direction. Here's the quick and dirty code:
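It boils down to something like the following (a sketch reconstructed from memory, so details may differ from the original listing; the mcpi calls in the comments follow the standard Pi Edition API, and the stone block type is an assumption):

```python
def sphere_blocks(cx, cy, cz, r):
    """All integer block coordinates inside a sphere of radius r
    centered at (cx, cy, cz)."""
    blocks = []
    for x in range(cx - r, cx + r + 1):
        for y in range(cy - r, cy + r + 1):
            for z in range(cz - r, cz + r + 1):
                if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r:
                    blocks.append((x, y, z))
    return blocks

# On the Pi itself, placing the sphere 10 blocks ahead of the player:
# from mcpi import minecraft, block
# mc = minecraft.Minecraft.create()
# p = mc.player.getTilePos()
# for x, y, z in sphere_blocks(p.x, p.y, p.z + 10, 4):
#     mc.setBlock(x, y, z, block.STONE.id)
```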

You can see the results in this photo:

[pe2-image src="http://lh3.ggpht.com/-mpxhCovV3jw/URqfv6RtGBI/AAAAAAAACwY/jwlRM6KiZb0/s144-c/IMG_20130212_120032.jpg" href="https://picasaweb.google.com/100175922620194527589/InstantUpload#5844159111912822802" caption="IMG_20130212_120032.jpg" type="image" alt="IMG_20130212_120032.jpg" ]

Pretty fun stuff, even if it's very basic. I know that the hardcore MC fans have already been scripting some pretty fantastic stuff in the PC version. It should be interesting to see what people do with the Pi version.