Process Elasticsearch JSON on the shell

Let’s throw security out the window for a moment. Say we store user accounts with clear-text passwords in Elasticsearch; what is the easiest way to use the results in a shell script? We can begin by creating two accounts, one for admin and one for john:

    # curl -XPUT localhost:9200/site/people/1?pretty=True -d '{"name": "admin", "password": "secret", "admin": "true"}'
    {
      "_index" : "site",
      "_type" : "people",
      "_id" : "1",
      "_version" : 1,
      "_shards" : {
        "total" : 2,
        "successful" : 1,
        "failed" : 0
      },
      "created" : true
    }
    # curl -XPUT localhost:9200/site/people/2?
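The excerpt cuts off above, but to give one idea of how such stored documents can be consumed from a script, here is a minimal Python sketch (Python 2 style to match the era; the approach is an assumption, not necessarily the post’s):

    #!/usr/bin/env python
    # Minimal sketch: fetch a stored document from Elasticsearch and print
    # a single field so a shell script can capture it.
    import json
    import urllib2

    doc = json.load(urllib2.urlopen('http://localhost:9200/site/people/1'))
    print doc['_source']['password']  # prints: secret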

Telegraf laptop battery plugin

I wanted to expand a little on my previous blog post, Custom Telegraf Plugin, and decided to do a simple battery monitor. The end result looks something like the screenshot in the original post. I decided to read from the file /sys/class/power_supply/BAT0/capacity on my Ubuntu 14.04 machine; this file simply shows the current battery percentage:

    # cat /sys/class/power_supply/BAT0/capacity
    62

All that is needed is a little Python script to convert this output to JSON; my script’s output looks like this:
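The script itself is truncated from this excerpt; as a rough sketch (an assumption, not necessarily the author’s original), something like this would do the conversion:

    #!/usr/bin/env python
    # Read the battery capacity from sysfs and emit it as JSON, a format
    # Telegraf's exec input plugin can consume.
    import json

    with open('/sys/class/power_supply/BAT0/capacity') as f:
        capacity = int(f.read().strip())

    print json.dumps({'capacity': capacity})

which would print something like {"capacity": 62}.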

Custom Telegraf Plugin

I just started looking into InfluxDB and Telegraf for collecting data on a Linux machine, then visualizing it with Grafana. I’ve historically used collectd, statsite, and graphite to accomplish the same sort of task, but wanted to see how some of the newer software compares. I’m running an Ubuntu 14.04 LTS virtual machine, so feel free to follow along. I managed to install the packages from the InfluxDB Ubuntu repositories:
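The install commands are cut off above, but to sketch where this heads: a custom Telegraf plugin is typically just a script that the exec input plugin runs, printing metrics to stdout. A hypothetical example (the metric names are my own, not from the post):

    #!/usr/bin/env python
    # Hypothetical exec-plugin script: emit system load averages as JSON
    # on stdout for Telegraf to collect.
    import json
    import os

    load1, load5, load15 = os.getloadavg()
    print json.dumps({'load1': load1, 'load5': load5, 'load15': load15})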

PostgreSQL vacuum script

PostgreSQL does have a built-in autovacuum, but sometimes you just want a small script that can be run through Jenkins to perform the vacuum for you. I wanted to share a small Python script I wrote that will perform a VACUUM VERBOSE ANALYZE on every table within a database. You will need to install psycopg2 from PyPI first:

    pip install psycopg2

At which point you should be able to use the below script with the correct environment variables to vacuum your database:
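The script itself is truncated from this excerpt; a minimal sketch of the idea (assuming libpq’s standard PG* environment variables such as PGHOST, PGUSER, PGPASSWORD, and PGDATABASE are set) could look like:

    #!/usr/bin/env python
    # Sketch: VACUUM VERBOSE ANALYZE every table in the public schema.
    import psycopg2

    conn = psycopg2.connect('')  # connection details come from PG* env vars
    conn.autocommit = True       # VACUUM cannot run inside a transaction block

    cur = conn.cursor()
    cur.execute("SELECT tablename FROM pg_tables WHERE schemaname = 'public'")
    for (table,) in cur.fetchall():
        cur.execute('VACUUM VERBOSE ANALYZE "%s"' % table)

    conn.close()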

Raspberry Pi – cleverbot voice communication

Using my first generation Raspberry Pi and a few USB / analog devices, I’ve been able to create a (rather slow) cleverbot voice communicator. The slowdown comes from initializing and listening on the USB microphone, but other than that everything works as expected.

    #!/usr/bin/env python
    import speech_recognition as sr
    import pyttsx
    import cleverbot

    print 'Initializing, please wait...'

    # define our cleverbot
    cb = cleverbot.Cleverbot()

    # speech recognizer setup
    r = sr.

Janky Lego stop motion

Well, the kids have lost interest in Raspberry Pi Python programming for now, but look who’s still at it! The jankiest of Lego stop motions. Here is the code I tossed together to make the gif above:

    #!/usr/bin/env python2
    import os
    import time
    import shutil
    import datetime
    import tempfile
    import pygame.camera
    import pygame.image
    import RPi.GPIO as GPIO

    save_dir = '/usr/share/nginx/www'

    GPIO.setmode(GPIO.BCM)
    GPIO.setwarnings(False)
    GPIO.cleanup()
    GPIO.setup(17, GPIO.IN)

    pygame.camera.init()
    camera = pygame.camera.Camera('/dev/video0')

    def make_picture(filename):
        raw_input('Ready for picture?

GeoDjango and Taco Bell

I’ve been at it again with GeoDjango; this time I pulled data on all Taco Bell locations from a popular social media site, added that data to a Django project, and finally plotted the locations in a view using Google Maps. Wow, that is a lot of Taco Bells! Since this is Django, we are also able to view and edit the data from the admin:
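The screenshots are elided from this excerpt, but the moving parts are roughly a geographic model plus an admin registration. A hypothetical models.py sketch (model and field names are illustrative, not from the original post):

    # Hypothetical sketch for a Django 1.x era app's models.py.
    from django.contrib.gis import admin
    from django.contrib.gis.db import models

    class Location(models.Model):
        name = models.CharField(max_length=100)
        point = models.PointField()

        objects = models.GeoManager()  # enables spatial lookups in Django 1.x

    admin.site.register(Location, admin.OSMGeoAdmin)  # map widget in the admin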

Using GeoDjango to filter by Points

Just recently I found myself playing with GeoDjango; I’ve been using it on both an Ubuntu 14.04 cloud server and a MacBook Pro (OS X El Capitan). GeoDjango allows us to query by geographic points directly on the data model. We are then able to extend the model and add a custom method to search by zipcode. Using the Django shell we can easily check data in our favorite interpreter:
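A hedged example of the kind of point query this enables (the model, field names, and coordinates are assumptions for illustration):

    # From the Django shell: find places within ten miles of a point.
    from django.contrib.gis.geos import Point
    from django.contrib.gis.measure import D
    from places.models import Place  # hypothetical model with a PointField named "point"

    point = Point(-118.2437, 34.0522)  # longitude, latitude (Los Angeles)
    nearby = Place.objects.filter(point__distance_lte=(point, D(mi=10)))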

Simple EC2 Instance + Route53 DNS

If you have a multi-environment AWS setup and want an easy way to resolve all EC2 instances using Route53 DNS, look no further! Currently I’m maintaining a production and staging environment on Amazon Web Services across multiple regions. We tend not to use Elastic IPs, as that just increases cost, plus internally we resolve using Consul. There is one drawback to not using Elastic IPs: whenever an instance restarts it will be offered a new dynamic IP (we will solve this with automation).
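The automation itself is beyond this excerpt, but the core of such a scheme is usually a boot-time script that upserts a Route53 record with the instance’s current IP. A sketch of that idea using boto3 (the zone ID and record name are placeholders):

    #!/usr/bin/env python
    # Sketch: fetch this instance's public IP from the EC2 metadata service
    # and UPSERT an A record in Route53 so DNS follows the dynamic IP.
    import urllib2
    import boto3

    ip = urllib2.urlopen('http://169.254.169.254/latest/meta-data/public-ipv4').read()

    route53 = boto3.client('route53')
    route53.change_resource_record_sets(
        HostedZoneId='Z123EXAMPLE',            # placeholder hosted zone ID
        ChangeBatch={'Changes': [{
            'Action': 'UPSERT',
            'ResourceRecordSet': {
                'Name': 'web01.example.com.',  # placeholder record name
                'Type': 'A',
                'TTL': 60,
                'ResourceRecords': [{'Value': ip}],
            },
        }]},
    )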

C# applications deployed with Docker and Mono

Lately I’ve been working a lot with Mono and building C# applications on Linux. Just recently I discovered the official mono image in the Docker Hub Repo. This image comes with xbuild and NuGet (tools we need for building). So let’s do a little work and get a Mono application up and running (note: I’m using a company application and will remove any references that may be sensitive). I start by pulling the application’s source code down beside the Dockerfile: