Let’s Encrypt!

Encryption is in the news again. Various three-letter government organizations want backdoors in devices like cell phones for surveillance. Of course, once a backdoor or exploit exists in an operating system or application, anyone can use it to track traffic from these devices. Trying to limit it to “lawful interception” would be impossible.

Encryption has two significant roles: security, so third parties cannot view your traffic, and authentication, so you have some confidence that you are talking to the right party. Having traffic in the clear, without encryption, means that your communications can be easily captured and your session can be spoofed. You certainly don’t want your web sessions with your bank in the clear, where a nefarious party can watch your traffic and even spoof your session to transfer your funds to them. Internet commerce would not work without encryption.

The encryption method that web sites use is called HTTPS. It uses a protocol called TLS to set up an encrypted session between you and the web site. The nice thing about HTTPS and TLS is that they can use a number of different strong ciphers, making it pretty difficult for third parties to sniff your traffic. HTTPS also uses a “chain of trust” system to provide some authentication that the web site you are using is really the site you think it is.

Up until recently, acquiring a certificate and setting up HTTPS for a web site has not been for the faint of heart. It also has been pretty expensive: purchasing a certificate can run between $250 and $500 a year. Your personal web site, or even a small company, may not have the coin to purchase a certificate. As such, many sites have opted not to run HTTPS and instead run the more common and insecure HTTP protocol. This is where Let’s Encrypt comes into this story.

To quote from Let’s Encrypt’s web site:

Let’s Encrypt is a free, automated, and open certificate authority (CA), run for the public’s benefit. Let’s Encrypt is a service provided by the Internet Security Research Group (ISRG). The ISRG is a non-profit with the mission to “reduce financial, technological, and educational barriers to secure communication over the Internet.”

Let’s Encrypt is doing just that. It addresses the speed bumps to creating secure communications: it is free and it is simple. For most operating systems and web servers, it just means downloading the Let’s Encrypt software, running it and restarting the web server. Your site would be up and running with a valid HTTPS session. Although this is true for most Linux distributions, it isn’t quite there for UNIX-like systems such as FreeBSD, which this site uses. It did take me a bit more hacking around to get this to work. Googling around, you can find out how to get this software to work on FreeBSD, as well as how to configure your web server (e.g. Apache on this box) to use Let’s Encrypt’s certificate and to update the cert when it expires.

One of the nice things about Let’s Encrypt is the process of proving who you are. Normally, with any other certificate authority, it would mean email, phone calls, etc., back and forth a number of times, which can take hours or days. The Let’s Encrypt process just requires you to take down your web site for the short period you run the Let’s Encrypt client. Running the client puts up a little web site that the Let’s Encrypt servers will validate against. If you have control over your domain, this process will work and the Let’s Encrypt servers will hand back a certificate for your web site that is good for 90 days. From then on, you just run the client software, say, once a month to update the certificate. Any operating system has a way of running scheduled tasks, such as “cron” on Linux/FreeBSD. Oh… and all of this is free.
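As a sketch of the cron idea, a monthly crontab entry might look like the one below. The client name and paths here are assumptions from my own FreeBSD/Apache setup, not something the Let’s Encrypt docs prescribe; use whatever your port or distribution actually installs.

```
# Hypothetical crontab entry: at 03:15 on the 1st of each month, renew the
# certificate, then restart Apache so it picks up the new cert.
# (Client command and rc.d path are assumptions -- adjust for your install.)
15 3 1 * * /usr/local/bin/letsencrypt renew && /usr/local/etc/rc.d/apache24 restart
```

Since the certs are good for 90 days, a monthly renewal leaves plenty of slack if a run fails.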

So there shouldn’t be an excuse for running a non-encrypted web site now. Protect yourself and your users by using HTTPS with Let’s Encrypt.

[A very nice page detailing how to install and renew Letsencrypt certs for Ubuntu can be found at: https://www.digitalocean.com/community/tutorials/how-to-secure-apache-with-let-s-encrypt-on-ubuntu-14-04 – Tim]

Ubiquiti Rockets and a 50Km Path Over Water…

This last fall, we put in a 50Km 5.8GHz link from the center of San Francisco (Twin Peaks) to the South East Farallon Island lighthouse using Ubiquiti Rockets. At first the link was unusable, mainly because the long distance and the shot over water cause the received signal to vary wildly. This caused the radios to frequently and rapidly try to change the MCS (modulation scheme), which made the link very lossy. Here are some settings I had to settle on to get the link to work.

  • Do not enable auto-negotiate for the signal rate on long links. The radios will auto-negotiate data rates when the received signal level changes, momentarily dropping the link while the ends sync up. If the signal is bouncing frequently, this will make the link pretty lossy or not usable at all.
  • Long links, or links that are being interfered with, will likely have problems with modulation schemes that have an amplitude component, such as QAM. If so, use a modulation scheme that doesn’t have an amplitude component, like BPSK, where you can leverage the “Capture Effect”. This would be MCS0 (1 chain) and MCS8 (2 chains).
  • Fix the distance of the link to about 30% over the calculated distance. The auto-magic distance calculation that AirOS does is typically wrong on long links.
  • Turn off AirMax on Point to Point links. AirMax is used to manage multiple clients on one AP more fairly. Not needed for P2P.
  • Use as narrow a channel as you can for the bandwidth you need. As per the AirOS manual…
Reducing spectral width provides 2 benefits and 1 drawback.
  • Benefit 1: It will increase the amount of non-overlapping channels. This can allow networks to scale better
  • Benefit 2: It will increase the PSD (Power Spectral Density) of the channel and enable the link distance to be increased
  • Drawback: It will reduce throughput proportional to the channel size reduction. So just as turbo mode (40MHz) increases possible speeds by 2x, half spectrum channel (10MHz), will decrease possible speeds by 2x.
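To make that last trade-off concrete, here is a back-of-the-envelope sketch of the proportional scaling the manual describes. The 50 Mbps base figure is made up for illustration, and real links won’t scale perfectly linearly.

```python
# Rough throughput scaling with channel width, per the AirOS note above:
# throughput scales in proportion to channel width relative to the 20 MHz default.
def scaled_throughput(base_mbps_at_20mhz, width_mhz):
    """Estimate link throughput for a given channel width.

    base_mbps_at_20mhz -- throughput measured at the standard 20 MHz width
    width_mhz          -- chosen channel width in MHz (e.g. 5, 10, 20, 40)
    """
    return base_mbps_at_20mhz * (width_mhz / 20.0)

# A link doing 50 Mbps at 20 MHz:
print(scaled_throughput(50, 40))  # "turbo" 40 MHz, roughly 2x -> 100.0
print(scaled_throughput(50, 10))  # half-spectrum 10 MHz, roughly 0.5x -> 25.0
```

In exchange for the 10 MHz penalty you get more non-overlapping channels and higher PSD, which is usually the right trade on a long, interference-prone path.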

Airfields of Yesteryear

An older post of mine talked about looking back at history through the little geodetic survey benchmarks you see in the sidewalk and at the base of older buildings. Modern archaeology has always interested me, and if it interests you too, there is a wonderful site documenting abandoned airports around the US named “Abandoned & Little-known Airfields”. It covers the history and evidence left behind from when general aviation was more popular, as well as the strange little military operations that were out in the middle of nowhere.

Having spent the last 25 or so years living in San Francisco, it was a surprise to find out about strips that I didn’t know about, such as the Bay Meadows Airport in San Mateo and the Marina Airfield next to Crissy Field in San Francisco. Marina Airfield was the first terminus of the United States Post Office Department Trans-Continental Air Mail Service.

Growing up in Fresno, I remember the remnants of Furlong Field just out Shaw Avenue. Good to see it documented here so it isn’t forgotten, as development has pretty much obliterated any trace of it.

Sad to see so many fields disappear with the wane of general aviation in this country. It is just too expensive for most to own or lease a plane and keep it up, and land is being sold to developers as cities see better tax revenue from a shopping center than an air strip.

Small form-factor broadcast console…

Mackie set the standard in inexpensive, small form-factor recording and sound consoles. I own a 1402 VLZ console that fits in a small briefcase and sounds great. The problem is that it is the wrong console for most of the work I do that needs a console. Coming from the broadcast side of the world and not the recording side, I want things like a cue bus that sits at the end of the fader travel, or control room monitors that mute when I turn on the mic so I don’t get feedback. I want logic that can switch a CD player into play when I bring up the fader or hit a start button. None of these “features” are typically required on recording and sound consoles, and that is where the biggest market for companies like Mackie is.

Allen & Heath, a respected name in recording consoles, has just come out with their first stab at a broadcast console in the same sort of form-factor as the Mackie 1402. It is called the XB-14 and has most of the bells and whistles I have been looking for. I have been told by Mark Haynes at Leo’s Pro Audio that they should have one in next week to test drive, and I am looking forward to seeing if they got it right.

One “down” side of the console is the price. It is selling at just under $1,400, though I have seen it advertised at $1,200. The Mackie 1402 runs around $500. I can see that the XB-14 has some extra features to make it more of a broadcast desk, but $700 more? I hope some of these boxes sell to encourage folks like Mackie to compete for this market.

Great tool for checking Line of Sight…

Google Maps has opened up access to resources that would otherwise take considerable work and expense to reach. Just purchasing software that could do ray tracing over a geographic area 10 years ago would have cost tens of thousands of dollars. Now “HeyWhatsThat” has leveraged Google Maps to do just this, and it is free.

Now, why would I be so interested in this site? Being a bit of a wireless geek, I find it a great starter tool for understanding how much coverage area a mountain top has. In the example shown on the right, you can see the coverage area from the Twin Peaks communications site in San Francisco. The orange/red overlay indicates the area that this site can see. You can see the shadowing of some of the hills of San Francisco affecting the coverage area.

The top of the frame shows a panorama of the skyline as seen from that site. The list on the right shows which mountain tops can be seen and the distance to them.

HeyWhatsThat is a great starting point in checking out coverage area. I wouldn’t throw away your $50,000 coverage software just yet, as it will be a bit more accurate, using better algorithms to calculate coverage such as Longley-Rice and TIREM as well as its own tweaks.
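At its core, the viewshed idea is just a line-of-sight test repeated over terrain. Here is a toy sketch of a single line-of-sight check over an elevation profile; all the numbers are made up, and it ignores earth curvature, refraction, and Fresnel zones, which the serious tools model.

```python
# Toy line-of-sight check over a terrain profile, in the spirit of what
# HeyWhatsThat computes for every point around a site.
def has_line_of_sight(profile, tx_height=10.0, rx_height=10.0):
    """profile: list of terrain elevations (meters) at evenly spaced points
    from transmitter to receiver. Antennas sit tx_height/rx_height above
    the terrain at each endpoint."""
    n = len(profile)
    tx = profile[0] + tx_height
    rx = profile[-1] + rx_height
    for i in range(1, n - 1):
        # Elevation of the straight sight line at intermediate point i
        sight = tx + (rx - tx) * i / (n - 1)
        if profile[i] > sight:
            return False  # terrain blocks the path
    return True

flat = [100, 100, 100, 100, 100]
hill = [100, 100, 300, 100, 100]   # a 300 m hill mid-path
print(has_line_of_sight(flat))  # True
print(has_line_of_sight(hill))  # False -- the "shadowing" you see on the map
```

Run that test along rays in every direction from a summit and you get exactly the kind of shadowed coverage overlay the site draws.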

Major Rant – Title 24

In its intention, Title 24 tries to reduce power consumption for new and remodeled buildings. In its current structure, it can make things worse.

I really have been trying to be thoughtful in my design and purchase of lighting for our new kitchen remodel. I have been looking at every different lighting option, in particular LED, on the assumption that anything that doesn’t have a “heater” in it creating waste heat is good (BTW, LED lighting is a whole other blog post). One of the first things I ran into in my design is California’s Title 24 requirements. At least up to August of this year, California has standards with a rather strange way of promoting and calculating effective power usage for kitchens.

  1. There is no limit on the power you can put into the lighting of a kitchen. (a bad thing)
  2. The wattage allocated for high efficacy lighting must be 50 percent or more of total lighting wattage. (a bad thing)
  3. Any fixture that can take non-high efficacy lighting devices like incandescent will be counted at the maximum wattage of the fixture. (a bad thing)
  4. High efficacy lighting is based on the number of lumens per watt. (a good thing)

What this means is that if you have 200 watts of low-efficacy lighting, you must have 200 watts of high-efficacy lighting, which makes no sense at all. In fact, it could force you into putting in more high-efficacy lighting than you need. In addition, the old screw-type Edison base for light bulbs is a standard. There are many more compact fluorescent bulbs out there designed for this base than for a proprietary pin-type base for recessed lighting. Guess which one is cheaper? In order to “comply” with Title 24, you need to use non-standard fixtures.
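The 50-percent rule above can be sketched in a few lines. The wattage numbers here are illustrative, not from the code text itself, and this ignores the fixture-maximum-rating wrinkle for low-efficacy sockets.

```python
# Sketch of the (pre-August) Title 24 kitchen lighting rule as described above:
# high-efficacy wattage must be at least 50% of total lighting wattage.
# Note there is no absolute cap on total watts.
def title24_kitchen_ok(high_efficacy_watts, low_efficacy_watts):
    total = high_efficacy_watts + low_efficacy_watts
    return high_efficacy_watts >= 0.5 * total

# 200 W of low-efficacy lighting forces at least 200 W of high-efficacy lighting:
print(title24_kitchen_ok(199, 200))  # False -- one watt short of parity
print(title24_kitchen_ok(200, 200))  # True  -- 400 W total, no cap triggered
```

Which is the absurdity: adding *more* high-efficacy watts is the compliant fix, even when you don’t need the light.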

If they really want to “fix” this, they could just limit the number of watts per square foot and strike out this silliness with standard lighting fixtures.

The Farallon Islands’ Web Cam

For some time now, as someone who has had an objective to get broadband into remote areas of the world, I have been looking at some lonely islands 50Km off the coast of San Francisco known as the Farallon Islands. Back in 2001 or so, at a past Bay Area Wireless Users Group meeting, Simon Barber suggested that we hook the island up, as the main island is staffed and they had just basic two-way radios for communication to the mainland. For various reasons it never quite happened until this year, when a number of different interests and funding fell into place.

I was introduced to folks at AirJaldi who were looking for locations in the Bay Area to test their radio deployments. I have access to a number of hill tops around the Bay Area and suggested to them that we put a link into the Farallons. I called US Fish and Wildlife and was pointed to the Point Reyes Bird Observatory, as they do the day-to-day operations and science on the islands. At the same time I reached out to them, the California Academy of Sciences was looking to put a high-definition web cam out on the island to stream back to the public. Bingo: we had funding and very interested parties that wanted fast bandwidth to the island.

After much work in planning, purchasing and deployment, the Farallon Cam was turned up a couple of weeks ago.

It hasn’t been smooth. Some of the problems encountered have been links failing due to interference or hardware failure. This has caused the stream to be down more than we wanted, but it did show that, on a small budget, consumer-grade unlicensed radios can provide decent bandwidth over tens of km to provide the infrastructure for applications like streaming video, voice, data, etc.

Starbucks Gold Card..

Months fly by and I get back into the mode of throwing out pointers and thoughts again.

I came across a nice little deal in the last couple of weeks: the Starbucks Gold Card. For $25 a year you get 10% off of over-priced purchases, but what hooked me was the fact that you also get 2 hours a day of free WiFi at a Starbucks ATT hotspot. A nice little deal when ATT wants to ding you for $6 to $10 for some part or whole of a day. Just 4 or 5 two-hour visits pays this thing off. The pricing works well if you are a light user of these spots. If I were more of a road warrior, then I would look more at Boingo. As someone that is looking for a place to do some work between meetings, it fits the bill.
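The break-even arithmetic, for the curious, assuming the $6–$10 day-pass pricing mentioned above:

```python
# Back-of-the-envelope payoff for the $25/year card against ATT day-pass pricing.
card_cost = 25.0
for day_pass in (6.0, 10.0):
    visits = card_cost / day_pass
    print(f"At ${day_pass:.0f}/day, the card pays for itself after {visits:.1f} visits")
```

At the cheap end of the day-pass range that works out to a bit over four visits, which is where the “4 or 5 visits” figure comes from.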

Uncovering history through National Coast and Geodetic Survey benchmarks

Gads it has been a while. Between conferences, work and play, I let this site languish a bit. Well, here is a cool thing to do and not a rant.

When walking around, do you ever look down and see the small circular brass markers embedded into rock, concrete or even buildings? They were placed by the National Coast and Geodetic Survey and are survey benchmarks.

A good description of these benchmarks can be found at:

Many of these were surveyed and planted there at the turn of the last century. As some time has elapsed since most of these were established, there is some interesting history to be had by looking up these sites and reviewing the data sheets for each marker. Fortunately, the National Geodetic Survey (NGS) has this data on line. By going to:

and entering your longitude and latitude and a radius of the area you want, you can get nearly all of the markers. I say “nearly” as I have found markers that are not in the database. Once you get the listing of markers in your area, tell the page to get the data sheets for the markers. Each data sheet is a bit messy to read, and you can get some guidance on how to go through them from the “peakbagging.com” URL above.

Once you have the data sheets, try to find the markers. It is much like geocaching in tracking it down, ’cept you don’t leave or take anything once you find the spot.

What has been interesting to me when I have gone looking for the markers is the history of the site over the time the markers have been in place. Looking at the data sheets, you can see the descriptions of the site at the time it was established and at each revisit every 30 years or so, with the changes in the area documented in these sheets. For instance, living in San Francisco, I was interested in seeing markers in my area. What I found was the establishment of a long-forgotten overseas radio station near my house at Ocean Beach, and I was able to review how South of Market had changed from a massive Southern Pacific rail yard to a trendy geek enclave.

Homebrew GSM with OpenBTS

Unlike most of San Francisco, which was at Burning Man, I was out this last week at the Strawberry Music Festival, so there was another gap between postings. There are a couple of interesting stories from Strawberry I will get into later. But for now…

I was having lunch with John Gilmore, who was freshly back from Burning Man. Stories abound from the festival, but the one I zeroed in on was the experiment by one of the camps at Burning Man to build a GSM cellular service on a Software Defined Radio (SDR). John was certainly tracking this, as he helped start GNU Radio, an open-source SDR package that is currently led by Eric Blossom. Matt Ettus is another notable, as he created the default hardware platform that GNU Radio runs on.

SDRs are radios where the hardware is designed to be very general purpose, so as to receive and/or transmit many different types of modulation and frequencies. The software for an SDR does all the work of setting the frequencies and deciding what to do with the received signals. Hence the name Software Defined Radio: the software models a hardware design that would otherwise be a physical building block, such as a single-sideband modulator or a quadrature FM detector. A critical component in an SDR is therefore an Analog to Digital Converter (ADC) or a Digital to Analog Converter (DAC), since the software does all the math in the digital domain on the sampled analog wireless signal.
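As a flavor of what “modeling hardware in software” means, here is a minimal sketch of one of those building blocks, a quadrature FM detector, operating on complex baseband samples like those an ADC front end would deliver. The sample rate and tone are made-up test values.

```python
# A quadrature FM detector in software: the instantaneous frequency is just
# the phase change between consecutive complex baseband samples.
import numpy as np

def quadrature_fm_demod(iq):
    """Return instantaneous frequency (radians/sample) from complex samples."""
    return np.angle(iq[1:] * np.conj(iq[:-1]))

# Simulate a pure 1 kHz tone at a 48 kHz sample rate...
fs, f = 48_000, 1_000
t = np.arange(1024) / fs
iq = np.exp(2j * np.pi * f * t)

demod = quadrature_fm_demod(iq)
# ...and the detector reads back a constant 2*pi*f/fs per sample:
print(np.allclose(demod, 2 * np.pi * f / fs))  # True
```

In a real receiver the same few lines replace what used to be a discriminator circuit, which is exactly the trick OpenBTS plays at GSM scale.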

What was interesting to me with this deployment was that the OpenBTS project built an SDR radio that would transmit and receive on cellular GSM frequencies and become a base station for GSM cell phones. Not only that, they took an open source phone switch called Asterisk and, using VoIP and the Internet connection at Black Rock City, provided cell service to the public switched telephone network (PSTN).

Providing service at Burning Man was a stroke of genius, as the OpenBTS folks had a great test bench: almost no existing cell coverage and thousands of attendees with their cell phones turned on. In fact, this test site was a bit too successful, as every GSM phone in range tried to associate with their setup in almost an inadvertent denial of service attack, and their rig had a hard time keeping up with the requests. After they resolved this issue, they were experimentally passing phone calls from Black Rock City off to real telephone numbers around the world.

Now, this is all fun playing with Asterisk and OpenBTS in the middle of North America, where wireless phone service is taken for granted. In the United States we usually have at least two or three wireless phone carriers providing service to an area. Although these deployments are ubiquitous in first world countries, a conventional cell phone tower installation is expensive and financially prohibitive to deploy in third world nations, or even in many lightly populated areas of the US. Asterisk, which easily replaces a phone switch costing tens or hundreds of thousands of dollars, helps reduce the cost of deploying wireless cell service. In the case of the Burning Man deployment, OpenBTS was able to put up a rather effective cell site for under $5,000. With this low cost of deployment, the combination of OpenBTS and Asterisk could bring open source cell phone service to vast areas of the earth that have no phone coverage today.