Thursday, July 07, 2011

Building a High Pressure Solar Powered Rainwater Irrigation System

Rainwater is free, and so is sunlight. Why not put the two together and build a sustainable irrigation system? Here's how I got it going.

In the early spring of 2011 my wife started a new garden at our house in North Carolina. Once all the tilling and planting was done, we needed a watering system (aka drag a hose to the end of the lawn). Seeing a bump in our water bill and running hoses back and forth got a little old, so I wanted to design a sustainable irrigation system. To the drawing board and time for a project!

The goals of the project were pretty straightforward:
  • Store and reuse rainwater runoff from the house
  • Easy to use (i.e. pull the handle on the hose and it sprays)
  • High pressure to service 200+ feet of hose and/or underground pipe
  • Completely green: leverage solar power to run the pump system - why not at this point?
Why, you ask? Well, because everyone needs a backup plan for irrigation during a complete power and city water service interruption! Just kidding - this project started by looking into renewable water resources, and adding solar power just seemed to fit the overall theme it took on. Plus it turned out pretty cool: it's off the grid and completely renewable. [Cue "Go Green" chant here.] This could easily be adapted for remote locations, etc.

First I set off looking at how other people had built rainwater systems. I found that many used reclaimed 55 gallon barrels to store rainwater. Although this is a great idea, I needed more storage space, so I quickly started looking at larger containers. I went with a 400 gallon polypropylene tank that I picked up locally from a farmer who had used it for water only. Before running across the tank I considered using standard IBC totes, which a lot of people seem to be using with good success - I recommend finding some of those totes if you can't get your hands on a good tank.

Rainwater quick math..

45 minutes of decent rain fills my 400 gallon tank, so don't think you won't have enough water to fill the tank you choose. More specifically, 1 inch of rain on a 1,000 square foot roof yields 623 gallons. One other consideration I will mention: factor in how much your tank will weigh when full. Water weighs 8.35 lbs per gallon, so at 400 gallons my full tank weighs in at a hefty 3,340 lbs (or as much as a typical mid-sized car). I would NOT suggest setting that on your deck - support its foundation well.
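Those rules of thumb are easy to turn into a quick sketch you can adapt to your own roof and tank (numbers from this post; the 0.623 factor is just 623 gallons per 1,000 sq ft per inch of rain):

```python
# Quick rainwater math from the post, as a sketch you can adapt.
GALLONS_PER_INCH_PER_SQFT = 0.623   # 623 gal per 1,000 sq ft per inch of rain
LBS_PER_GALLON = 8.35               # weight of water

def rain_yield_gallons(roof_sqft, rain_inches):
    """Approximate gallons collected from a given roof area and rainfall."""
    return roof_sqft * rain_inches * GALLONS_PER_INCH_PER_SQFT

def full_tank_weight_lbs(tank_gallons):
    """Weight of a full tank - support its foundation well."""
    return tank_gallons * LBS_PER_GALLON

print(round(rain_yield_gallons(1000, 1)))   # -> 623
print(round(full_tank_weight_lbs(400)))     # -> 3340
```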

Now that I had a tank mounted in position under a good gutter downspout, I started looking at pumps.

Many small residential rainwater systems rely on gravity to feed a small spigot. As you can see on the left, I incorporated that too, but I needed high pressure across a long distance, so a good pump was necessary. Unbeknownst to me when I started this endeavor, pumps are apparently a science unto themselves, and before this I was completely uninitiated. After speaking with several pump vendors, finishing the internet (yep, the entire internet), and earning an honorary degree in hydrodynamics, I decided on a 12v on-demand diaphragm pump. Basically, when the pump detects a drop in pressure on the hose side (sprayer nozzle open) it starts, and when pressure builds back up (sprayer nozzle closed) it stops. 12v because I would be running it off a deep cycle battery charged by solar, and on-demand so I wouldn't have to explain to anyone how not to burn up a pump.
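If you're curious what "on demand" means in logic terms, here's a sketch of the pressure-switch behavior with hysteresis - the cut-in/cut-off numbers below are illustrative, not the actual specs of any particular pump:

```python
# Sketch of on-demand pump logic: a pressure switch with hysteresis.
# Cut-in/cut-off values here are illustrative only.
CUT_IN_PSI = 40    # pressure falls below this (nozzle open) -> pump starts
CUT_OFF_PSI = 60   # pressure builds to this (nozzle closed) -> pump stops

def pump_state(pressure_psi, currently_running):
    """Return True if the pump should be running at this line pressure."""
    if pressure_psi <= CUT_IN_PSI:
        return True                 # nozzle open, pressure fell: start
    if pressure_psi >= CUT_OFF_PSI:
        return False                # nozzle closed, pressure built: stop
    return currently_running        # in between: keep doing what we're doing

print(pump_state(30, False))  # nozzle opened -> True
print(pump_state(60, True))   # line fully pressurized -> False
print(pump_state(50, True))   # mid-band: stays on -> True
```

The gap between cut-in and cut-off is what keeps a healthy pump from chattering on and off; too much restriction downstream defeats it, as noted later on.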

I ended up purchasing a Delavan 7870 model pump, which was probably a little overkill. It pumps 7 GPM @ 60 PSI - it works great.

To power the system I picked up a deep cycle marine battery, a 45 watt solar panel, and a charge regulator. I got a great deal on the panels from Harbor Freight; so far so good, and they charge well even in less than optimal sun conditions. I wired the regulator into my sunroom, where I added a small 300 watt A/C inverter to also use the free solar power for a lamp or radio when I'm hanging out on the porch (an outlet on the porch was something the builder apparently forgot, but that's another story). The regulator also feeds the marine battery, keeping it fully charged for the pump. All these electronics I stuffed in a little cabinet to keep them dry and to monitor the charge, etc. I mounted a power switch on the battery box to start up the pump and wired it all up.

The pump wasn't designed to be doused with rain and the elements, so I mounted it inside another battery box. According to the manufacturer it also has some tendency to overheat, so I wired in a small 12v fan from my computer graveyard parts bin. It seems to keep it cool enough for the service it's seen so far.

A couple more notes from talking with people and from trial and error. Be sure to have a good hose or pipe on the output side of the pump. Your spray nozzles can't be too restrictive, or they will cause the pump to reach 60 PSI (or whatever your pump's cut-off point is) and shut off. You will know you have too much restriction if you spray the hose and the pump cycles on and off continuously - this is bad and will kill the pump in short order. The idea is for the pump to stay on continuously while the trigger on the hose is pulled or the sprinkler is running.

In lieu of a first flush system that diverts the first 10-20 gallons of water into a reserve tank (for cleaner water), I grabbed a small skimmer basket from my local pool supply store - a tiny one, around 6" across, that fit perfectly into the tank opening. I then put a filter sock in the skimmer basket (the pool store had those also). So far it provides pretty good filtration for all but the finest particulates, and it's really easy to pop out and clean. I may eventually build a first flush system, however.

For the garden I dug a trench of about 220ft and ran 3/4" PEX piping. This is connected to a bib that feeds a 3/4" commercial sprinkler head. There was plenty of flow and not enough restriction to cycle the pump on and off - perfect. (The pic to the right is finishing up the trench to bury the pipe.)

For the regular hose I have 200ft of 3/4" hose with a run-of-the-mill sprayer nozzle. I did have to try several before I found one that had enough flow, thereby keeping the pump running. Remember, an on-demand pump cycling on and off will become a boat anchor very soon. (For reference, I use either the hose or the pipe to the sprinkler, not both at the same time.)

Some takeaways from using my system:
  • The 3/4" sprinkler head throws water 40' (A = π * r²) - so that's irrigating over 5,000 sq ft off a solar charged pump!
  • The tank fills in about 45 minutes of decent rain
  • The deep cycle battery has never completely drained with solar and current usage (yet)
  • While running the commercial sprinkler, the tank can be completely drained in about an hour
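The first and last takeaways check out with some quick math (the drain-time estimate assumes the pump actually moves its rated 7 GPM, which is optimistic over a long run of pipe):

```python
import math

# Coverage: a sprinkler throwing water 40 feet covers A = pi * r^2.
throw_radius_ft = 40
coverage_sqft = math.pi * throw_radius_ft ** 2
print(round(coverage_sqft))   # -> 5027 sq ft, i.e. "over 5,000"

# Rough drain time: a 400 gallon tank at the pump's 7 GPM rating.
tank_gallons = 400
pump_gpm = 7
print(round(tank_gallons / pump_gpm))  # -> 57 minutes, "about an hour"
```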
Some adds I am considering:
  • First flush diversion system for cleaner water
  • Float switch to turn off pump when water is low
  • Check valves and add line from house so I can switch sprinkler to city water if no rain
  • Secondary tank under porch
Although this is a high level overview of the entire design and build, hopefully it will help you with your own sustainable solar powered rainwater system. Good luck, and have fun building something.

If you read this far, you should follow me on Twitter!

Saturday, July 03, 2010

The Oryan Project: from the garage to near space

Update: I created a subsite for this project, as there's way too much information and there are too many images for a blog post. Check out our website for this project at

Well, it's been quite a while since an update. We've been working on a number of things, but probably the most interesting was/is Project Oryan (Astrohack). The project was a self-induced challenge to launch a platform into the stratosphere and take some amazing pictures of the little ball on which we live, all from commercially available parts and some garage space.

To date we've had many successful launches and have developed a stable, repeatable launch process and platform. We've made it to 100k feet several times and captured some amazing shots and data. We were surprised at all the attention it has received, and we've been covered by local and national media. We've also been fortunate to attend youth science activities to show our craft and discuss the challenges of near space photography.

The pictures below are from our second try. We lost communication with Oryan I on descent, but I'm happy to say Oryan II, which was launched on July 1, 2010, was recovered successfully and worked flawlessly. Expect a better write-up soon, but for now take a look at a couple of the pictures we got...


Predicted Trajectory:
Panoramic from 100 or so of the pics stitched together:
A couple of the over 2200 pics we got from near space:

Tuesday, June 23, 2009

The Great Tech Organization and the Digital Split

Ever since private companies and governments started using computers, there has been someone making the decisions about which technologies to leverage. In the beginning, options were limited, so choices were easy (albeit expensive). Entities adopt methods of sharing information and, for better or worse, stick with the plan over the course of a decade or longer. As fast-paced as the industry is, there have only been a few leaps that changed the game entirely, but they all add up to where technology fits in today's workplace (and, for that matter, where it's going).

1. Cheap Computers: Once computers got cheap enough, they became ubiquitous.
2. The Internet: Once they were all connected, people could share information (chaotically at best).
3. The Great Organization: This is where we are now - web 2.0, open access to information, clouds forming.

Numbers one and two are past tense, so let's focus on three: The Great Tech Organization, or so I call it. A lot of people refer to it as web 2.0, 3.0, cloud, or Generation Y computing. Whatever nomenclature you tag it with, the idea is the same: untangle the lack of standards left by the Internet boom and really start using information efficiently by making access platform-transparent (my name is still the best, however).

"The Great Tech Organization" made possible incredibly powerful applications that we all use (OK, unless you are over 35 - we'll get to that later) daily. Interconnected (via standards) cloud applications and methods like tagging, Facebook, Google Apps, YouTube, Twitter, etc. are all built in a way to share the information they contain freely, at will, with anyone or any application that chooses to access it. This did a number of things, but mainly it allowed everyone from the best application engineers down to shade tree developers to tap into incredibly powerful, specialized central systems to enhance their own applications, while letting the people decide what data they need. So... that's why there are embedded YouTube videos on every random website out there. Hmmm, that's why I can view what 200 of my friends have on their minds right now, from my Blackberry - and react if I want to. Let the people have the data and they will figure out what to do with it; that's the idea here. Once all these systems had a way to communicate other than via a browser (which required a human), the internet started to become a less chaotic, more efficient, and more organized place to live.

So here we are in 2009 with all of these great applications at our disposal, on demand. At no other time could the average Joe access so much information so quickly. We use all this information socially - is big business taking note?

The Digital Split in today's businesses: There was a time when one generation of people who didn't grow up with computers disregarded them; computers were almost entirely embraced by the youth but not by their parents. That's what I call the "Analog Split". Today almost everyone uses a computer to communicate in one way or another, and for the most part people have accepted that the internet is here to stay. What we see now is the Digital Split, where one generation of business is used to doing things the old-fashioned way (centralized in-house servers, email messaging, custom specialized applications) versus the current/future way (cloud computing, social networking, and web 2.0 applications).

So who is adopting the new methods? Which side of the Digital Split is your company?

Social network links on the official White House website

Let's look at the feds. The US Government is actually doing a great job, in my opinion, with at least part of the solution - and much better than many large companies (yep, I said it). This is largely due to a lot of youthful influence on new IT policy and a new administration willing to roll the dice. They are starting by communicating with the masses. The president spells this out as the new government vision in a January 2009 briefing from the White House. You'll notice the White House has its own YouTube channel and Facebook page (as does the State Dept, and so on and so on). Obama pushes Twitter updates constantly. It's not 100% effective yet, or nearly close, but you can see that they take it seriously. Take a look at the project and you will see the feds are also putting open access methods into practice (someone up there is on the right side of the split). It's not just a way for the country to promote its agenda (it is), but it's also the new way of doing business. The people have adopted these methods to communicate personally - why shouldn't business? I say it should, if it wants to remain competitive.

So take note of which side of this digital split your company is on. The lines between personal and professional computing are blurring by the day. The smart money rightfully recognizes the power this brings. Skills that will be in demand in the future may revolve around YouTube, Facebook, or anything else your boss doesn't want you using while at "work". For the first time since the internet arrived, the people are ahead of business in the way they think about sharing information. Besides, 1 million heads are better than, say... one - right?


Saturday, January 10, 2009

Weathering a "Reply to All" storm

March 2011 Update: Had a great conversation with The Wall Street Journal about reply all storms - Check out the article and graphic based on our interview ("The Perfect Email Storm") on the WSJ site here.
If you work for a large organization, chances are you have seen a "reply to all" email storm. It starts out like this: someone sends an email to a distribution list which contains every email address in the company (thousands of addresses). One person clicks "reply to all" and says something like "please remove me from this discussion". Well, this is the first of three phases of an email storm. I'll call it the calm before the storm, to make it more exciting... If you're keeping count, we've got two emails out to everyone in the company.

Phase two. Many of the recipients of the email think, "hey, me too, this email has nothing to do with my job or what's on woot today so I want off this email chain too". So about 5% of these people click "reply to all" and send a message indicating they want off too. Now we're up to 70 or 80 emails to everyone in the company.

Phase Three of the storm commences. Phase three is when another 5% of the recipients get tired of the unsolicited flood of emails so they, yes, REPLY TO ALL to inform everyone to stop replying to all. It makes me tired just typing it, but it happens and it gets even worse. The replies start nice, of course everyone wants to help. Then people start getting angry, and, yes, reply to all to tell them about it.

Now this all sounds a bit silly, but I have seen this happen twice in the last two years at two different (large) organizations. The first time, at "company A", was bad, but thankfully it was limited to a small (in comparison) distro list of about 1,500 people. The second and most recent instance of the phenomenon was a lot worse, and the initial email went out to about 25,000 recipients. The result of the second example was tormented Exchange servers that couldn't handle the load and inevitably shut down email for the organization, globally. (Yes, there were some CHOICE emails in the flood that were quite funny, and I assume more than one person was canned for their replies.) Needless to say, there are a lot of embarrassed people at the site right now over the self-inflicted email crash.
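Some back-of-the-envelope math shows why the servers melted, using the rough 5% reply rate from the phases above:

```python
# Back-of-the-envelope storm math; the ~5% reply rate is the rough
# figure from the phases described above, not a measured number.
recipients = 25000
reply_rate = 0.05

phase2_replies = int(recipients * reply_rate)      # people hitting "reply to all"
messages_delivered = phase2_replies * recipients   # each reply fans out to everyone

print(phase2_replies)       # -> 1250 replies
print(messages_delivered)   # -> 31250000 deliveries from phase two alone
```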

So what do you do? First off, don't reply to all to tell everyone not to reply to all. Even if you haven't done one full hour of actual work all year and you KNOW that if you tell everyone to stop, that will save the day, everyone will clap, and you will get a raise. That won't happen; it just won't. So don't click it!

The second thing: if you send an email to a large distribution list, put the distro list address in the bcc line. Then, in the first line of the email, indicate the name of the list the email was sent to so all the recipients know. If a recipient replies to all on a message you sent to a bcc, it will only go to the sender and not to what was in the bcc.

Another novel way, if again you are the sender, is to block the reply-all button for your recipients. Now, granted, this will only work for Outlook users on Exchange within an organization, but if that's a good match, then this will actually remove the button from the recipients' mail for your message. Pretty cool! Here's how to do it:

Add the following macro to your Outlook (2003):

Sub NoReplyAll()
    ' Disables the "Reply to All" and "Forward" actions on the message
    ' currently open in the active inspector window. Run before sending.
    Dim myolapp As Object
    Dim myinspector As Object
    Set myolapp = CreateObject("Outlook.Application")
    Set myinspector = myolapp.ActiveInspector
    ' These flags travel with the message, so Exchange hides the buttons
    ' for recipients in the same organization using Outlook.
    myinspector.CurrentItem.Actions("Reply to All").Enabled = False
    myinspector.CurrentItem.Actions("Forward").Enabled = False
End Sub

After creating the macro, you can create a button in your message window to run the macro when you compose a new message. Running the macro prior to sending changes the metadata that Exchange reads to disallow the “Forward” and “Reply to All” buttons for everyone receiving the message within the same organization and using Outlook.

Anyhow, for what it’s worth, if you are sending email to a large distribution list or know someone that frequently does – this may be helpful in stopping a storm before it starts.


Tuesday, August 26, 2008

Version 2.0, Release 2 is here! (Ryker's new brother)

I've been offline AGAIN, anxiously awaiting the arrival of our second son, and we're very excited he is here! Cade William Butcher was born on August 24th, 2008, weighing 7 lbs 3 oz and measuring 19 inches. We're thrilled about the new addition to our family and development staff (in a few years, maybe)!!

Friday, February 22, 2008

3DTelemetry beta is here!

In a previous post I alluded to a project I was working on involving GPS and OBDII vehicle data. The beta is now online and ready for download here. All you need is an NMEA compatible GPS device or logger, and an OBDII scanner if you want to import vehicle data (3DTelemetry will create maps without OBDII also).

3DTelemetry takes data logged in your favorite OBDII scan tool (growing compatibility list here) and merges it with the GPS data that it (or an external GPS logger) acquires along your drive. 3DTelemetry then exports the data in KML format for viewing within Google Earth. So, essentially, you will be able to see on a 3D map that your Mass Air Flow pressure was 2 lbs in corner 4 and that the engine reported 6400 RPM while you were passing grandma's house.
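This isn't 3DTelemetry's actual code, but a KML export for a logged point boils down to something like the sketch below - one Placemark per sample, with the OBDII readings stuffed into the description (the coordinates and values here are made up):

```python
# Sketch of a KML export for one logged GPS/OBDII point.
# Note KML orders coordinates as lon,lat,alt (not lat,lon).
def placemark(lat, lon, alt_m, name, description=""):
    """Return a KML <Placemark> string for a single logged point."""
    return (
        "<Placemark>\n"
        f"  <name>{name}</name>\n"
        f"  <description>{description}</description>\n"
        "  <Point>\n"
        "    <altitudeMode>absolute</altitudeMode>\n"
        f"    <coordinates>{lon},{lat},{alt_m}</coordinates>\n"
        "  </Point>\n"
        "</Placemark>"
    )

print(placemark(35.78, -78.64, 120, "corner 4", "RPM: 6400"))
```

Wrap a series of these in a `<Document>` inside a `<kml>` root and Google Earth will render the whole drive.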

This is obviously going to be an ongoing project and I will update my blog with any dramatic changes, but check the site ( for the latest information. Debugging from the passenger seat has presented an entirely different programming experience, but this one has been a lot of fun to create so far. Special thanks to Jay and the gang at Autoenginuity and others for fielding my incessant questions about OBDII.

I intend to keep the application free during this initial beta phase and then very affordable once we have a full release. I think there are too many high priced doodads out there for the amateur racer these days. So if you enjoy your car and you're looking for something useful and cool that won't break the bank, check it out!

Please feel free to drop me a line with any suggestions or comments!

Tuesday, December 04, 2007

Making sense of the Global Positioning System

Thanks to the U.S. Department of Defense (and good ole President Ronald Reagan), GPS signals are freely available for civilian use. The fact is, today GPS is basically ubiquitous in most people's lives. Most new cars use it to show you where you are and your proximity to the nearest Starbucks. Raising the "cool factor" bar is what GPS means for the Internet as we know it. For example, people are geocoding the images in their online photo albums, and cool apps like Google Earth, geocaching, and all sorts of new creative games using real places are sprouting up all over the Internet (people are going outside again!). Basically, with GPS the Internet can break out of its closed, linear stage and become part of our real three-dimensional world. Very cool, but more on that later - let's dive into what makes the Global Positioning System tick and how you can take advantage of the technology in your next project.

You will find that most GPS devices out there (USB GPS receivers for your PC, handheld GPSs, etc.) report data from satellites in a neat standard format called NMEA. NMEA uses a serial ASCII protocol to send GPS data to your application for consumption. The resulting comma-delimited data is piped out or logged sentence by sentence from your device. The NMEA format makes it easy to take the data and parse it any way we'd like. Now, we could just use the software that came with our Microsoft Streets & Trips or Earthmate GPS, but that's no fun.

Decoding GPS log files
There are many different types of NMEA sentences. Luckily, they are all in an easy to read format. The first field is the sentence type and will start with $. I am only going to focus on a couple of sentence types that give us our position information. For the purpose of this post we'll ignore the other sentences, but there are plenty of references online about them.

My favorites are $GPGGA and $GPRMC. You will find that these two have all the data you should need for tracking.

Example sentence (GPGGA):


Here is what each field means:
1 = UTC of Position
2 = Latitude
3 = N or S
4 = Longitude
5 = E or W
6 = GPS quality indicator (0=invalid; 1=GPS fix; 2=Diff. GPS fix)
7 = Number of satellites in use [not those in view]
8 = Horizontal dilution of position
9 = Antenna altitude above/below mean sea level (geoid)
10 = Meters (Antenna height unit)
11 = Geoidal separation (Diff. between WGS-84 earth ellipsoid and
mean sea level. -=geoid is below WGS-84 ellipsoid)
12 = Meters (Units of geoidal separation)
13 = Age in seconds since last update from diff. reference station
14 = Diff. reference station ID#
15 = Checksum
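As a sketch of how you might turn those fields into usable coordinates - NMEA packs latitude as ddmm.mmmm and longitude as dddmm.mmmm, so a little arithmetic converts them to decimal degrees (the sample sentence below is the widely circulated NMEA example, not one from my logs):

```python
# Sketch: parse a GPGGA sentence into decimal-degree coordinates.
def parse_gpgga(sentence):
    """Return (lat, lon) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    assert fields[0] == "$GPGGA"
    # Latitude field is ddmm.mmmm: degrees then minutes
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60
    if fields[3] == "S":
        lat = -lat
    # Longitude field is dddmm.mmmm: three degree digits then minutes
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60
    if fields[5] == "W":
        lon = -lon
    return lat, lon

lat, lon = parse_gpgga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(round(lat, 4), round(lon, 4))   # -> 48.1173 11.5167
```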

Example sentence (GPRMC):


Here is what each field means:

1 = UTC time of fix
2 = Data status (A=Valid position, V=navigation receiver warning)
3 = Latitude of fix
4 = N or S of longitude
5 = Longitude of fix
6 = E or W of longitude
7 = Speed over ground in knots
8 = Track made good in degrees True
9 = UTC date of fix
10 = Magnetic variation degrees (Easterly var. subtracts from true course)
11 = E or W of magnetic variation
12 = Mode indicator, (A=Autonomous, D=Differential, E=Estimated, N=Data not valid)
13 = Checksum

By parsing these sentence types, either in real time by reading the data from a COM port or from an existing log file, you can use the coordinate information any way you choose. I am currently working on a cool (OK, I think it's cool..) vehicle tracking application that I should be releasing here on this site soon. There are a lot of good examples floating around with source code that should give you a handle on using this data.
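Whether you read live from a COM port or replay a log file, it's worth validating each sentence's checksum before trusting it - it's just the XOR of every character between the "$" and the "*", compared against the two hex digits at the end. A minimal sketch:

```python
# Sketch: validate an NMEA sentence checksum (XOR of chars between $ and *).
def nmea_checksum_ok(sentence):
    """Return True if the sentence's trailing hex checksum matches."""
    body, _, given = sentence[1:].partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)            # running XOR over the sentence body
    return calc == int(given[:2], 16)

print(nmea_checksum_ok(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
```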

In closing, the Global Positioning System is a powerful resource that you can tap into to make your applications spatially aware. This opens plenty of doors for new and innovative apps - so get coding!
