Raspberry Pi – cleverbot voice communication

Using my first generation Raspberry Pi and a few USB / analog devices, I've been able to create a (rather slow) Cleverbot voice communicator.

The slowdown comes from initializing and listening on the USB microphone;
other than that, everything works as expected.

#!/usr/bin/env python

import speech_recognition as sr
import pyttsx
import cleverbot

print 'Initializing, please wait...'

# define our cleverbot
cb = cleverbot.Cleverbot()

# speech recognizer setup
r = sr.Recognizer()

# engine for text to speech
engine = pyttsx.init()
#engine.setProperty('rate', 20)


while True:

    # obtain audio from the microphone
    # this is the bit of code that takes a long time...
    with sr.Microphone() as source:
        print 'Talk to cleverbot!'
        audio = r.listen(source)

    phrase = r.recognize_google(audio)
    print '  me: %s' % phrase
    resp = cb.ask(phrase)
    print '  cleverbot: %s' % resp

    engine.say(resp)
    engine.runAndWait()
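
One tweak that might cut down on the per-loop delay (untested on my setup, so treat it as a sketch) is to open the microphone once, calibrate for ambient noise a single time, and keep the source open across iterations:

# untested sketch: keep the microphone open across loop iterations
with sr.Microphone() as source:
    # calibrate once for ambient noise instead of paying the cost every pass
    r.adjust_for_ambient_noise(source)

    while True:
        print 'Talk to cleverbot!'
        audio = r.listen(source)

        phrase = r.recognize_google(audio)
        print '  me: %s' % phrase
        resp = cb.ask(phrase)
        print '  cleverbot: %s' % resp

        engine.say(resp)
        engine.runAndWait()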


Janky Lego stop motion

Well, the kids have lost interest in Raspberry Pi Python programming for now, but look who's still at it! The jankiest of Lego stop motions.

[animated GIF: Lego stop motion]

Here was the code I tossed together to make the gif above:

#!/usr/bin/env python2

import os
import time
import shutil
import datetime
import tempfile
import pygame.camera
import pygame.image
import RPi.GPIO as GPIO

save_dir = '/usr/share/nginx/www'

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.cleanup()
GPIO.setup(17, GPIO.IN)

pygame.camera.init()
camera = pygame.camera.Camera('/dev/video0')

def make_picture(filename):
    raw_input('Ready for picture? ')
    camera.start()
    image = camera.get_image()
    pygame.image.save(image, filename)
    camera.stop()

def make_gif(frames=5):
    print 'Making you a gif using %s frames, get ready!' % frames
    time.sleep(0.5)
    tmp_dir = tempfile.mkdtemp()
    for i in range(frames):
        print 'Taking picture!'
        make_picture('%s/%s.jpg' % (tmp_dir, i))
        time.sleep(3)

    # stitch the frames together into a gif with ImageMagick
    print 'Converting images to gif, please wait...'
    os.system('convert -delay 20 %s/*.jpg %s/animated.gif' % (tmp_dir, tmp_dir))

    filename = '%s.gif' % datetime.datetime.now().isoformat()
    shutil.move('%s/animated.gif' % tmp_dir, '%s/%s' % (save_dir, filename))
    shutil.rmtree(tmp_dir)
    print 'Complete!'

# poll the button on GPIO 17, make a gif when pressed
try:
    while True:
        if GPIO.input(17):
            make_gif()
        time.sleep(0.1)
finally:
    GPIO.cleanup()

And a picture of the rig:

[photo: the Raspberry Pi camera and button rig]



GeoDjango and Taco Bell

I've been at it again with GeoDjango. This time I pulled data on all Taco Bell locations from a popular social media site, added it to a Django project, and finally plotted the locations
in a view using Google Maps:

[screenshot: Taco Bell locations plotted on a Google Map]

Wow, that is a lot of Taco Bells!

Since this is Django, we are also able to view and edit from the admin:

[screenshot: editing a Taco Bell location in the Django admin]

As well as the shell:

[screenshot: querying Taco Bell locations from the Django shell]

django-cities was used to tie it all together, which lets me run searches like how many Taco Bells the cities with the highest population have:

In [1]: from cities.models import City

In [2]: from hub.models import Location

In [3]: for city in City.objects.order_by('-population')[:20]:
    locations = Location.objects.filter(cities_city=city)
    print '%s with population %s has %s Taco Bells' % (
      city.name, city.population, len(locations))
   ...:
New York City with population 8175133 has 0 Taco Bells
Los Angeles with population 3792621 has 34 Taco Bells
Chicago with population 2695598 has 15 Taco Bells
Brooklyn with population 2300664 has 5 Taco Bells
Borough of Queens with population 2272771 has 0 Taco Bells
Houston with population 2099451 has 54 Taco Bells
Philadelphia with population 1526006 has 13 Taco Bells
Manhattan with population 1487536 has 1 Taco Bells
Phoenix with population 1445632 has 34 Taco Bells
The Bronx with population 1385108 has 0 Taco Bells
San Antonio with population 1327407 has 22 Taco Bells
San Diego with population 1307402 has 22 Taco Bells
Dallas with population 1197816 has 27 Taco Bells
San Jose with population 945942 has 18 Taco Bells
Indianapolis with population 829718 has 30 Taco Bells
Jacksonville with population 821784 has 18 Taco Bells
San Francisco with population 805235 has 11 Taco Bells
Austin with population 790390 has 25 Taco Bells
Columbus with population 787033 has 23 Taco Bells
Fort Worth with population 741206 has 16 Taco Bells

Or how many Taco Bells each state in the United States has:

In [1]: from cities.models import Region

In [2]: from hub.models import Location

In [3]: for region in Region.objects.all()[:10]:
    locations = Location.objects.filter(cities_state=region)
    print '%s has %s Taco Bells' % (region.name, len(locations))
   ...:
Arkansas has 58 Taco Bells
Washington, D.C. has 5 Taco Bells
Delaware has 14 Taco Bells
Florida has 363 Taco Bells
Georgia has 193 Taco Bells
Kansas has 65 Taco Bells
Louisiana has 92 Taco Bells
Maryland has 91 Taco Bells
Missouri has 157 Taco Bells
Mississippi has 48 Taco Bells
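
For reference, the Location model behind these queries isn't shown in this post. A minimal sketch of what it might look like, taking only the cities_city and cities_state field names from the filters above (everything else is guesswork), is:

# hub/models.py -- a rough sketch, not the exact model used above
from django.contrib.gis.db import models
from cities.models import City, Region


class Location(models.Model):
    name = models.CharField(max_length=100)
    location = models.PointField(blank=True, null=True)

    # django-cities relations used by the queries above
    cities_city = models.ForeignKey(City, blank=True, null=True)
    cities_state = models.ForeignKey(Region, blank=True, null=True)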


Using GeoDjango to filter by Points

Just recently I found myself playing with GeoDjango; I've been using it on both an Ubuntu 14.04 cloud server and a MacBook Pro (OS X El Capitan).

GeoDjango allows us to query by geographic points directly on the data model.
We can then extend the model with a custom method to search by zipcode.

Using the Django shell, we can easily check the data in our favorite interpreter:

$ ./manage.py shell

In [1]: from hub.models import Vendor

In [2]: Vendor.get_vendors(zipcode='78664', miles=5)
Out[2]: [<Vendor: Starbucks>]

In [3]: Vendor.get_vendors(zipcode='78664', miles=10)
Out[3]: [<Vendor: Starbucks>, <Vendor: Starbucks>, 
<Vendor: Starbucks>, <Vendor: Starbucks>, 
<Vendor: Starbucks>, <Vendor: Starbucks>, <Vendor: Starbucks>]

It’s then pretty easy to take that data and present it on a Google Map
(using the Django application’s views and templates):

[screenshot: Starbucks locations plotted on a Google Map]
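
The view behind that map isn't covered in the setup below, but a minimal sketch (the view name, template name, and context variable are all made up for illustration) could be as simple as:

# hub/views.py -- illustrative sketch only
from django.shortcuts import render

from hub.models import Vendor

def vendor_map(request):
    # hand every vendor to a template that renders them as Google Maps markers
    vendors = Vendor.objects.all()
    return render(request, 'hub/map.html', {'vendors': vendors})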

If you find any of this exciting, read on: I'm going to go over setting up the
environment from scratch (using a MacBook as the development environment).

Prerequisites

You will need Postgres.app, Homebrew, and Python's virtualenv installed before starting.

Setup

It is a good idea to add Postgres.app's bin path to your $PATH.

You should run the following command (changing the version to match your install),
and add it to the bottom of your ~/.bash_profile:

export PATH=$PATH:/Applications/Postgres.app/Contents/Versions/9.4/bin

Next, let's create our PostgreSQL database and enable the GIS extension.

Start the Postgres.app OS X application, then click the elephant icon in the menu bar and select Open psql.

jeffreyness=# create database geoapp;
CREATE DATABASE

jeffreyness=# \c geoapp
You are now connected to database "geoapp" as user "jeffreyness".

geoapp=# CREATE EXTENSION postgis;
CREATE EXTENSION

You can now close the psql shell.

Next, let's install Django into a virtualenv:

# create and change to new app directory
mkdir ~/geoapp && cd ~/geoapp/

# create a fresh virtual environment
virtualenv env

# activate the virtual environment
source env/bin/activate

# install Django inside the virtual environment
pip install Django

To use PostgreSQL with Python we will need the psycopg2 adapter installed;
be sure you added Postgres.app's bin path to your $PATH:

pip install psycopg2

GeoDjango requires the GEOS library to be available; we can install it with Homebrew:

brew install geos

We are now ready to create the Django project and application.

# create a new project using Django admin tool
django-admin startproject geoproject

# change to the newly created project directory
cd geoproject/

# create a new application
./manage.py startapp hub

Now you need to configure your Django application to use PostgreSQL and GIS.
Open geoproject/settings.py with your favorite text editor:

vim geoproject/settings.py

Append django.contrib.gis and hub to your INSTALLED_APPS:

INSTALLED_APPS = (
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.gis',
    'hub',
)

Next find the DATABASES portion and set it to the postgis engine:

DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'geoapp',
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': ''
    }
}

The next step is to create our model using GIS points.
Add the following to hub/models.py:

from django.contrib.gis.db import models
from django.contrib.gis.geos import Point, fromstr
from django.contrib.gis.measure import D


class Vendor(models.Model):

    name = models.CharField(max_length=100)
    longitude = models.FloatField()
    latitude = models.FloatField()
    location = models.PointField(blank=True, null=True)

    def __unicode__(self):
        return unicode(self.name)

    def save(self, *args, **kwargs):
        # keep the PointField in sync with the raw longitude/latitude columns
        if self.latitude and self.longitude:
            self.location = Point(float(self.longitude), float(self.latitude))
        super(Vendor, self).save(*args, **kwargs)
You will also want to add this model to the admin page, so update hub/admin.py:

from django.contrib import admin

from hub.models import Vendor

class VendorAdmin(admin.ModelAdmin):
    list_display = ('name', 'longitude', 'latitude')
    exclude = ('location',)

admin.site.register(Vendor, VendorAdmin)

At this point you are ready to create the database tables using the provided manage.py script:

./manage.py syncdb

I'm now going to jump into the Django shell to add data, but this can also be done from the admin (http://127.0.0.1:8000/admin):

./manage.py shell

In [1]: from hub.models import Vendor

In [2]: Vendor.objects.create(longitude=-97.677580, latitude=30.483176,
   ...: name='Starbucks')
Out[2]: <Vendor: Starbucks>

In [3]: Vendor.objects.create(longitude=-97.709085, latitude=30.518423,
  ...: name='Starbucks')
Out[3]: <Vendor: Starbucks>

In [4]: Vendor.objects.create(longitude=-97.658976, latitude=30.481517, 
   ...: name='Starbucks')
Out[4]: <Vendor: Starbucks>

In [5]: Vendor.objects.create(longitude=-97.654141, latitude=30.494810,
   ...: name='Starbucks')
Out[5]: <Vendor: Starbucks>

I can then define a point at the center of the city and filter for locations within a 5-mile radius:

In [6]: from django.contrib.gis.geos import fromstr

In [7]: from django.contrib.gis.measure import D

In [8]: point = fromstr('POINT(-97.6786111 30.5080556)')

In [9]: Vendor.objects.filter(location__distance_lte=(point, D(mi=5)))
Out[9]: [<Vendor: Starbucks>, <Vendor: Starbucks>, <Vendor: Starbucks>, 
<Vendor: Starbucks>]

Hope you found this article helpful; if you did, please share with friends and coworkers.



Simple EC2 Instance + Route53 DNS

If you have a multi-environment AWS setup and want an easy way to resolve all EC2 instances using Route53 DNS, look no further!

Currently I'm maintaining production and staging environments on Amazon Web Services across multiple regions. We tend not to use Elastic IPs as that just increases cost, and internally we resolve using Consul. There is one drawback to not using Elastic IPs: whenever an instance restarts it is offered a new dynamic IP (we will solve this with automation).

Our EC2 instances are deployed using SaltStack and salt-cloud, so adding this to our base SLS made sense. Below is a snippet of the states:

salt/base.sls

include:
  - cli53

#
# Update AWS Route53 with our hostname
#

/opt/update_route53.sh:
  file.managed:
    - source: salt://base/templates/update_route53.sh
    - mode: 775

update_route53:
  cmd.run:
    - name: /opt/update_route53.sh update {{ pillar['environment'] }}
    - unless: /opt/update_route53.sh check {{ pillar['environment'] }}
    - require:
      - pip: cli53
      - file: /opt/update_route53.sh

This state places a new script at /opt/update_route53.sh, then runs the update command unless the check command reports that the DNS record is already correct. The script requires cli53, so another SLS (included at the top) handles that install.

The script is merely a bash shell script with a case statement:

/opt/update_route53.sh

#!/bin/bash

#
# Simple script for updating Route53 with instance IPs
#
# How to get public ip for EC2 instance
# http://stackoverflow.com/a/7536318/4635050
#

ENVIRONMENT=$2  # either prod or dev
HOSTNAME=$(hostname)
PUBLIC_IP=$(curl -s http://instance-data/latest/meta-data/public-ipv4)
DNS_IP=$(dig $HOSTNAME.$ENVIRONMENT.yourdomain.com +short)

case "$1" in

  check)
    # exit non-zero when the record is missing or stale,
    # so the salt state knows an update is required
    if [[ "$DNS_IP" == "" ]] ; then
      exit 1
    elif [[ "$PUBLIC_IP" != "$DNS_IP" ]] ; then
      exit 1
    fi
    exit 0
    ;;

  update)
    if [[ "$DNS_IP" == "" ]] ; then
      echo "Did not find record for $HOSTNAME.$ENVIRONMENT, creating..."
      cli53 rrcreate yourdomain.com $HOSTNAME.$ENVIRONMENT A $PUBLIC_IP
    elif [[ "$PUBLIC_IP" != "$DNS_IP" ]] ; then
      echo "Found IP $DNS_IP for $HOSTNAME.$ENVIRONMENT, updating to $PUBLIC_IP"
      cli53 rrdelete yourdomain.com $HOSTNAME.$ENVIRONMENT
      sleep 30 # give AWS some time to delete
      cli53 rrcreate yourdomain.com $HOSTNAME.$ENVIRONMENT A $PUBLIC_IP
    else
      echo "No need to update, passing..."
    fi
    ;;
esac

Assuming you have pointed your domain's NS records at Route53 and the salt
state (or the script) has been run, you should be able to resolve your instances like below:

$ dig salt-master.prod.yourdomain.com +short
1.2.3.4
$ dig webapp.dev.yourdomain.com +short
4.3.2.1

Happy hacking!

