Posts

Toy Data Center update

It's been a while since I've written about my toy data center.  I started with two Intel NUCs and shortly thereafter expanded to four.  Each of the first pair has a 240 GB SSD, and each of the second pair sports a 480 GB SSD. All of them run Ubuntu 14.04.

Simple Python __str__(self) method for use during development

For my development work I want a simple way to display the data in an object instance without having to modify the __str__(self) method every time I add, delete, or rename members. Here's a technique I've adopted that relies on the fact that every object stores all of its members in a dictionary called self.__dict__ . Making a string representation of the object is then just a matter of returning a string representation of __dict__ . This can be achieved in several ways: one is simply str(self.__dict__) , and another uses the JSON serializer json.dumps() , which lets you pretty-print the result. Here's a little Python demonstrator program:

#!/usr/bin/python
"""
demo - demonstrate a simple technique to display text representations
    of Python objects using the __dict__ member and a json serializer.

    $Id: demo.py,v 1.3 2015/07/18 13:07:15 marc Exp marc $
"""
import json

class something(object):
    """ This is just a demon...
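Filling in the idea with an illustrative class of my own (the Widget name and its members are stand-ins, not the original demo's), the technique looks like this:

```python
import json

class Widget(object):
    """Hypothetical example class; only __str__ matters here."""
    def __init__(self):
        self.name = "widget-1"
        self.size = 42

    def __str__(self):
        # Serialize whatever members the instance happens to have;
        # this method never needs editing when members change.
        return json.dumps(self.__dict__, indent=4, sort_keys=True)

print(Widget())
```

Adding a member later (say self.color = "red") shows up in the printed output automatically, with no edit to __str__.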

Economical NUC desktop running Ubuntu

The TV in the kitchen has long had a Mac Mini attached to one of its inputs. We used it to watch Youtube videos, listen to music from iTunes and Google Music, browse the web, show photographs from our trips, and so on. Sadly, the little Mini passed away earlier this year, refusing to power up. When we priced out replacement machines we discovered that the new Minis were a lot more expensive, even if at the same time more capable. Given that we were not planning to store lots of data on the machine, we decided to leverage the lessons we had learned from building our little collection of NUC servers and design and build a small desktop on one of the NUC engines. We conducted some research and selected a machine sporting an i3 processor. The parts list we ended up with was:

Intel NUC DCCP847DYE [1 @ $146.22] Intel Core i3 Processor
Crucial CT120M500SSD3 [1 @ $72.09] 120GB mSATA SSD
Crucial CT25664BF160B [2 @ $20.97] 2GB DDR3 1600 SODIMM 204-Pin 1.35V/1.5V Memory ...

Adding CPUInfo to Sysinfo

There is a lot of interesting information about the processor hardware in /proc/cpuinfo. Here is a little bit from one of my NUC servers:

processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 69
model name      : Intel(R) Core(TM) i5-4250U CPU @ 1.30GHz
stepping        : 1
microcode       : 0x16
cpu MHz         : 779.000
cache size      : 3072 KB
physical id     : 0
siblings        : 4
core id         : 0
cpu cores       : 2
apicid          : 0
initial apicid  : 0
fpu             : yes
fpu_exception   : yes
cpuid level     : 13
wp              : yes
flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf eagerfpu pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 fma cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm ida arat epb xsaveopt pln pts dtherm tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 avx2 smep...
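Reading this file back programmatically is straightforward, since records are "key : value" lines separated by blanks. Here is a minimal sketch of my own (not necessarily how sysinfo.py does it) that parses /proc/cpuinfo-style text into one dict per processor:

```python
def parse_cpuinfo(stream):
    """Parse /proc/cpuinfo-style "key : value" records into a list
    of dicts, one per processor (records are blank-line separated)."""
    cpus, current = [], {}
    for line in stream:
        line = line.strip()
        if not line:
            # A blank line ends the current processor's record.
            if current:
                cpus.append(current)
                current = {}
            continue
        key, _, value = line.partition(":")
        current[key.strip()] = value.strip()
    if current:
        cpus.append(current)
    return cpus

# On a Linux box you would use it like:
# with open("/proc/cpuinfo") as f:
#     cpus = parse_cpuinfo(f)
```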

JSON output from DF

So I'm adding more capabilities to my sysinfo.py program. The next thing that I want to do is get a JSON result from df . This is a command whose man-page description says "report file system disk space usage". Here is a sample of the output of df for one of my systems:

Filesystem                 1K-blocks    Used Available Use% Mounted on
/dev/mapper/flapjack-root  959088096 3802732 906566516   1% /
udev                         1011376       4   1011372   1% /dev
tmpfs                         204092     288    203804   1% /run
none                            5120       0      5120   0% /run/lock
none                         1020452       0   1020452   0% /run/shm
/dev/sda1                     233191   50734    170016  23% /boot

So I started by writing a little Python program that used the subprocess.check_output() method to capture the output of df . This went through various iterations and ended up with this single line of python code, which requires eleven lines...
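As a sketch of the overall approach (the key names below are my own normalization, not necessarily the ones sysinfo.py uses): capture df with subprocess.check_output(), treat the first line as the header, and turn each remaining line into one JSON object:

```python
import json
import subprocess

def df_json():
    """Sketch: run df and reshape its tabular output into JSON."""
    out = subprocess.check_output(["df"]).decode()
    lines = out.strip().splitlines()
    # My own normalized column names for the six df columns.
    keys = ["filesystem", "blocks_1k", "used", "available",
            "use_pct", "mounted_on"]
    rows = []
    for line in lines[1:]:
        # Split into at most six fields so mount points with
        # spaces stay intact; skip wrapped/short lines.
        fields = line.split(None, 5)
        if len(fields) == len(keys):
            rows.append(dict(zip(keys, fields)))
    return json.dumps(rows, indent=4)

print(df_json())
```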

Automatic Inventory

Now I have four machines.  Keeping them in sync is the challenge.  Worse yet, even knowing whether they are in sync or out of sync is a challenge. So the first step is to make a tool to inventory each machine.  In order to use the inventory utility in a scalable way, I want to design it to produce machine-readable results so that I can easily incorporate them into whatever I need. What I want is a representation that is friendly both to humans and to computers.  This suggests a self-describing text representation like XML or JSON.  After a little thought I picked JSON. What sorts of things do I want to know about the machine?  Well, let's start with the hardware and the operating system software, plus things like the quantity of RAM and other system resources.  Some of that information is available from uname and more is available from the sysinfo(2) function. To get the information from the sysinfo(2) function I had to do several things: Install sysinfo on each machine sudo apt-get ...
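A minimal sketch of the idea, using only os.uname() and friends from the standard library (the field names are my own choices; the real tool also pulls RAM figures from sysinfo(2)):

```python
import json
import os
import platform

def inventory():
    """Sketch of a machine-inventory record as a JSON-ready dict."""
    u = os.uname()  # kernel and hostname details, Unix only
    return {
        "hostname": u.nodename,
        "system": u.sysname,
        "kernel": u.release,
        "machine": u.machine,
        "python": platform.python_version(),
        "cpus": os.cpu_count(),
    }

# One JSON record per machine, easy to diff across the fleet.
print(json.dumps(inventory(), indent=4, sort_keys=True))
```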

Log consolidation

Well, my nice DNS service with two secondaries and a primary is all well and good, but my logs are now scattered across three machines. If I want to play with the stats, diagnose a problem, or see when something went wrong, I have to grep around on three different machines. Obviously I could consolidate the logs using syslog; that's what it's designed for, so why don't I do that? Let's see what I have to do to make that work properly:

Set up rsyslogd on flapjack to properly stash the DNS messages
Set up DNS on flapjack to log to syslog
Set up the rsyslogd service on flapjack to receive syslog messages over the network
Set up rsyslog on waffle to forward DNS log messages to flapjack
Set up rsyslog on pancake to forward DNS log messages to flapjack
Set up the DNS secondary configurations to use syslog instead of local logs
Distribute the updates and restart the secondaries
Test everything

A side benefit of using syslog to accumulate my dns logs is tha...
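The rsyslog side of the steps above can be sketched roughly as follows. This assumes the legacy rsyslog directive syntax that ships with Ubuntu 14.04; the local6 facility and the file name are my illustrative assumptions, not the actual configuration:

```
# On waffle and pancake, e.g. in /etc/rsyslog.d/60-dns.conf
# (filename illustrative): forward what BIND logs -- assuming it
# logs to the local6 facility -- to flapjack over UDP
# ("@host" = UDP, "@@host" = TCP).
local6.*    @flapjack

# On flapjack, in /etc/rsyslog.conf, enable network reception:
$ModLoad imudp
$UDPServerRun 514
```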