Friday, June 15, 2012
Invitation: Gnome/Gtk+/GObject Study and Hack Group
Since I use Gnome on Linux pretty exclusively these days (and I'd like to continue), I'd like to start a study/hack group for Gnome/Gtk+/GObject programming.
I've been watching the GObject platform change quite a bit over the last few years. Most significant, in my opinion, has been the introduction of GObject introspection (GI), which essentially made it possible to use any C+GObject library in any language, as long as there's a GI binding for that language (and there are for Java, JavaScript, .Net (Mono), Python, Ruby, and more).
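For a taste of what GI buys you, here's a minimal sketch that drives the C Gtk+ 3 library from Python through the PyGObject binding (the window title is just a placeholder):

import gi
gi.require_version('Gtk', '3.0')
from gi.repository import Gtk  # the C Gtk+ library, resolved at runtime via GI

win = Gtk.Window(title='Hello from GI')
win.connect('destroy', Gtk.main_quit)
win.show_all()
Gtk.main()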
My goals are (1) to be comfortable enough with GI programming to quickly create applications that I want, without the GI layer being the roadblock, (2) to be comfortable enough with C+GObject or Vala to translate potentially reusable pieces of GI applications into actually reusable components. And (3ish) if I can contribute to the examples, tutorials, and docs all the while, that's a win too.
So, if anyone is interested in learning or practicing Gnome/Gtk+/GObject development in a group, hit me up @mjumbewu, or in the comments. Standing offer, so even if this post is several months old, hit me up.
Thanks!
- Mjumbe
Friday, February 10, 2012
Playing around with mapping & reducing
I've been doing a few experiments with mapping/reducing in Python using the multiprocessing module.
Some things I've learned
Some of these I suspected, but some were surprises to me.
- Passing data between the parent and child processes is slow -- try not to pass big data.
- Splitting across multiple processes actually does speed up computation, as long as you don't have to pass back a lot of info about each thing. For example, this could be a good candidate for mapping to processes:
def square_all(nums):
    return sum([i*i for i in nums])
This, maybe not so much:
def square_all(nums):
    return [i*i for i in nums]
The first just returns a single number for each subsection of data. The second returns a potentially large list (see the benchmark sketch after this list).
- Using processes and queues directly is more flexible, but mapping over pools saves a lot of typing. For example, these functions accomplish similar things. The first is a function that just runs the processing on a single entire data set. The next two demonstrate running the data in multiple processes. In multi, it splits the data roughly evenly over n processes and immediately runs those processes in parallel. In pooled, it splits the data into n parts and doles each part out over p processes. When n == p, multi and pooled do basically the same thing.
Note that, in order to accommodate the Queue pattern in multi, I had to modify the function being called:
def square_all(nums, q=None):
    if q:
        q.put(sum([i*i for i in nums]))
    else:
        return sum([i*i for i in nums])
- Performance of the process pools was more erratic than the manually handled processes and queues. This may be due to memory constraints; for some reason, when my script got down to looping over the pools, the memory usage went way up. I took a screenshot of my system monitor running on half as much data.
- My computer performed about equally well above 3 processes when they were handled manually. I expected that 8 processes would not have worked out well, but I was wrong.
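As promised above, here's a rough benchmark sketch of the big-result trade-off (the function names and chunk sizes here are my own, separate from the experiment script below):

import time
from multiprocessing import Pool

def sum_of_squares(nums):
    # Small result: only a single int crosses back over the process boundary
    return sum(i*i for i in nums)

def list_of_squares(nums):
    # Big result: the entire list gets pickled and shipped back to the parent
    return [i*i for i in nums]

if __name__ == '__main__':
    # Four chunks of 250k numbers each, mirroring the 1M-number experiment
    chunks = [range(i * 250000, (i + 1) * 250000) for i in range(4)]
    pool = Pool(4)
    for worker in (sum_of_squares, list_of_squares):
        start = time.time()
        pool.map(worker, chunks)
        print(worker.__name__, time.time() - start)
    pool.close()
    pool.join()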
The script
Here's the source of the script that I used to run the experiment:

#!/usr/bin/env python3
# -*- coding:utf-8 -*-

import time
from itertools import chain
from math import floor
from multiprocessing import Pool, Process, Queue

# ---------------------------------------------------------------------------
# The data to process.  Why a global?  No reason really.

full_count = 1000000
full_range = range(full_count)

# ---------------------------------------------------------------------------
# The processing function.  Something simple.

def square_all(nums, q=None):
    """
    Simple function to loop over and square all the numbers in ``nums``.
    """
    if q:
        q.put(sum([i*i for i in nums]))
    else:
        return sum([i*i for i in nums])

# ---------------------------------------------------------------------------
# The processing wrappers.

def single():
    """
    Run the function on the full range of values.
    """
    result = square_all(full_range)
    return result

def multi(n):
    """
    Partition the range into n parts and run each in a separate process.
    """
    # Partition
    parts = _partition(full_range, n)

    # Map
    queues = [Queue() for _ in range(n)]
    processes = [Process(target=square_all, args=(part, queue))
                 for part, queue in zip(parts, queues)]
    for process in processes:
        process.start()
    for process in processes:
        process.join()
    partial_results = [queue.get() for queue in queues]

    # Reduce
    result = _combine(partial_results)
    return result

def pooled(n, p):
    """
    Partition the range into n parts and run on a pool of p processes.
    """
    # Partition
    parts = _partition(full_range, n)

    # Map
    pool = Pool(p)
    partial_results = pool.map(square_all, parts)

    # Reduce
    result = _combine(partial_results)
    return result

# ---------------------------------------------------------------------------
# A few utilities for partitioning and combining.

def _partition(l, n):
    """
    Partition the list l into n parts, and return a list of the parts.
    """
    count = len(l)
    return [l[floor(i / n * count):floor((i + 1) / n * count)]
            for i in range(n)]

def _combine(partial_results):
    """
    Combine the list of partial results into a final result.
    """
    result = sum(partial_results)
    return result

def _time(f, reps=10, args=tuple(), kwargs=dict()):
    for _ in range(reps):
        t = time.time()
        f(*args, **kwargs)
        print(' ', time.time() - t)

# ---------------------------------------------------------------------------
# Here's the script to generate the results.

if __name__ == '__main__':
    print('Time for single process:')
    _time(single)

    for n in range(2, 9):
        print('Time for {} processes:'.format(n))
        _time(multi, args=(n,))

    for p in (2, 3, 4):
        for n in (p, 2*p):
            print('Time for {} partitions over {} processes:'.format(n, p))
            _time(pooled, args=(n, p))
Tuesday, November 1, 2011
Thoughts on my support of #occupy, as a software developer
- I also support the organization I'm working for, think they're doing good work, and feel that it is not incongruous with #occupy (though it's certainly not the same)
- I don't actually work in Oakland
- Help out #occupy tech support. I've come across requests from several #occupy communities for additional IT hands (I don't know whether there is a central place where these requests are posted). If you feel so inclined, volunteer a little time to maintaining an #occupy web site, or posting minutes, etc. (if anyone knows more about what tech needs #occupy communities have, please speak up).
- Help create logistical tools. The #occupy communities that I've seen are little micro-societies and, like any society, have a number of logistical considerations and concrete needs. Are there tools that could help? For instance, I liked the idea of needsoftheoccupiers, which they described as "a wedding registry meets Freecycle for the Occupy movement".
- Do something creative/educational/inspirational. #Occupy has people paying [at least superficial] attention to issues of economic [in]equality and social participation. That doesn't happen too often, so take advantage of it by creating a visualization or an info-app that awakens people's imaginations and helps them see things that they wouldn't normally. Things like BankMigration map and Slavery Footprint come to mind.
Update:
Something else that could perhaps use some help:
- @benzotweet is "trying to develop a database solution for occupy... crazy requirements! no training, role based security, decentralized... #wiki?"
"This isn't a protest, it's a revolutionary movement empowering people to create their own change. ...We're trying to encourage people to organize and create their own change outside of the existing establishment through direct action."Update 2:
An interesting idea for a project that came up last night: log tweets tagged with "#occupy..." that have pictures, pull out the geo exif data, and put it on a live-updating map ... kinda like http://www.artmapper.org/ (source at https://github.com/mertonium/muralmapper).
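For the geo EXIF piece of that, a rough sketch using PIL might start like this (the function name is my own; the TAGS and GPSTAGS tables ship with PIL):

from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def gps_from_photo(path):
    # Read the photo's EXIF block (None when the JPEG carries no EXIF data)
    exif = Image.open(path)._getexif() or {}
    # Translate numeric EXIF tag ids into readable names
    named = dict((TAGS.get(tag, tag), value) for tag, value in exif.items())
    gps_raw = named.get('GPSInfo')
    if not gps_raw:
        return None
    # Same translation for the nested GPS tags (GPSLatitude, GPSLongitude, ...)
    return dict((GPSTAGS.get(tag, tag), value) for tag, value in gps_raw.items())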
Update 3:
To stay abreast of tech-related opportunities to support #occupy, join the occupyhack googlegroup.
Friday, June 10, 2011
Generic Django Model Templates
Update 23 Jun 2011: I have renamed the django-model-filters package django-model-blocks. It now lets you easily override the template used for a given model. Check out the changes on Github or PyPI.
Tonight I'm writing my first Django custom filter. The problem I'm trying to solve is that I want generic templates. For a given model I want to be able to set up browseable index and detail pages with minimal effort. As it stands now, say I have the following model:
...
class PepulatorModel (models.Model):
    serial_number = IntegerField()
    height = IntegerField()
    width = IntegerField()
    manufacture_date = DateTimeField()
    color = CharField(max_length=32)

    def __unicode__(self):
        return u'Pepulator #%s' % self.serial_number
...
Now say I want my users to be able to browse my pepulators in a simple way, with the following caveats:
- They cannot edit pepulators, only view them (rules out the admin app)
- I want to define the URL structure (rules out databrowse) to be something like:
  http://www.mysite.com/pepulators/
  http://www.mysite.com/pepulators/?color=red
  http://www.mysite.com/pepulators/012345/
- I want to specify the base template so that it integrates well with the rest of my project (also rules out databrowse)
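For reference, wiring that URL structure to Django 1.3's class-based generic views might look roughly like this sketch (the pepulators app name and URL patterns are my own illustration, not from the original post):

from django.conf.urls.defaults import patterns, url
from django.views.generic import ListView, DetailView

from pepulators.models import PepulatorModel

urlpatterns = patterns('',
    # Index page; filtering like ?color=red would still need a get_queryset hook
    url(r'^pepulators/$', ListView.as_view(model=PepulatorModel)),
    # Detail page, keyed by primary key
    url(r'^pepulators/(?P<pk>\d+)/$', DetailView.as_view(model=PepulatorModel)),
)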
Currently, I can use the generic views ListView and DetailView, but I still have to write templates that go something like this:
{% extends "base.html" %}

{% block content %}
  <header>
    <h1>Pepulator #{{ pepulator.serial_number }}</h1>
  </header>

  <div>
    <span>Serial Number</span>
    <p>{{ pepulator.serial_number }}</p>
  </div>

  <div>
    <span>Height</span>
    <p>{{ pepulator.height }}</p>
  </div>

  <div>
    <span>Width</span>
    <p>{{ pepulator.width }}</p>
  </div>

  <div>
    <span>Manufacture Date</span>
    <p>{{ pepulator.manufacture_date }}</p>
  </div>

  <div>
    <span>Color</span>
    <p>{{ pepulator.color }}</p>
  </div>
{% endblock %}
Okay, a bit verbose, but it's not going to kill me. However, now say I want to change some of the fields on my model. Well, then I have to remember to change the fields in my template as well (error-prone — this is why you don't violate DRY without good reason).
All I wanted was a simple view of my model!
So, I considered making an app that was leaner than databrowse and just provided generic templates to go with generic views. I found myself having to extend the generic views anyway, though, because there's no way to access a model instance's fields and field names without explicitly feeding them to the template's context. Then, I gleaned some inspiration from uni_form: I'll make filters!
Now my plan is to be able to say, using the example of the Pepulator detail view above:
{% extends "base.html" %}

{% block content %}
  {{ pepulator|as_detail_html }}
{% endblock %}
Sublime. (This must exist somewhere; but for now, I can't find it.)
So, I start off by creating my app:
$ python manage.py startapp generic_templates
Now, from the documentation on creating custom tags and filters, I see I should create a templatetags directory in my app. In here I'll put an __init__.py file and a module called generic_filters. This way, when I'm done, to use the filters, I'll put near the top of my template file:
{% load generic_filters %}
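So the app's layout ends up looking like this (startapp generated the first few files; the templatetags pieces are the ones I just described):

generic_templates/
    __init__.py
    models.py
    tests.py
    views.py
    templatetags/
        __init__.py
        generic_filters.py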
I decided to start with the detail filter (as_detail_html), and to write a test first. I know generally what I want this to do, so I write the following test:
"""
Test the generic filters
"""
import datetime
from django.test import TestCase
from mock import Mock
from django.db.models import Model, IntegerField, DateTimeField, CharField
from django.template import Context, Template
from generic_templates.templatetags import generic_filters as gf
class DetailHtmlFilterTest (TestCase):
def setUp(self):
# Create a sample model
class PepulatorModel (Model):
serial_number = IntegerField(primary_key=True)
height = IntegerField()
width = IntegerField()
manufacture_date = DateTimeField()
color = CharField(max_length=32)
def __unicode__(self):
return u'Pepulator #%s' % self.serial_number
# Create a model instance
now = datetime.datetime.now()
self.m = PepulatorModel(
serial_number = 123456,
height = 25,
width = 16,
manufacture_date = now,
color = 'chartreuse',
)
# Mock Django's get_template so that it doesn't load a real file;
# instead just return a template that allows us to verify the context
gf.get_template = Mock(
return_value=Template('{{ instance|safe }}:{{ fields|safe }}'))
def test_model_format(self):
"""Tests that a given model is formatted as expected."""
expected_detail = (u"Pepulator #123456:[('serial number', 123456),"
" ('height', 25), ('width', 16), ('manufacture date', %r),"
" ('color', 'chartreuse')]") % self.m.manufacture_date
detail = gf.as_detail_html(self.m)
gf.get_template.assert_called_with('object_detail.html')
self.assertEqual(detail, expected_detail)
In short, set up a model and an easy template, and check that the template is filled in correctly. Of course, since I haven't yet written my filter, this fails.
This (as_detail_html) was a straightforward method to write, but I did get tripped up because of the poor documentation available on Models' Meta classes. Here's the first go at the filter:
from django.template import Context, Template
from django.template.loader import get_template


def as_detail_html(instance):
    """
    Template filter that returns the given instance as a template-formatted
    block.  Inserts two objects into the context:
      ``instance`` - The model instance
      ``fields`` - A list of (name, value)-pairs representing the instance's
                   fields
    """
    template = get_template('object_detail.html')
    fields = [(field.verbose_name, getattr(instance, field.name))
              for field in instance._meta.fields]
    context = Context({'instance': instance, 'fields': fields})
    return template.render(context)
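The generic object_detail.html that this filter loads isn't shown in the post, but given the context the filter builds, a sketch of it could be as simple as the following (the markup here is just one guess, mirroring the hand-written template above):

<header>
  <h1>{{ instance }}</h1>
</header>
{% for name, value in fields %}
<div>
  <span>{{ name|capfirst }}</span>
  <p>{{ value }}</p>
</div>
{% endfor %}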
One other thing: I actually want to be able to use the filter in my templates, not call it directly in my code. I'm new here, so I write another test to make sure I understand what's going on:
    def test_filter_is_registered(self):
        """Test that the filter can be used from within a template"""
        template = Template(('{% load generic_filters %}'
                             '{{ pepulator|as_detail_html }}'))
        context = Context({'pepulator': self.m})

        expected_detail = (u"Pepulator #123456:[('serial number', 123456),"
            " ('height', 25), ('width', 16), ('manufacture date', %r),"
            " ('color', 'chartreuse')]") % self.m.manufacture_date

        detail = template.render(context)

        gf.get_template.assert_called_with('object_detail.html')
        self.assertEqual(detail, expected_detail)
And it turns out all I have to do to satisfy it is change my module in the following way:
from django.template import Context, Template, Library
from django.template.loader import get_template

register = Library()

@register.filter
def as_detail_html(instance):
    ...
Now I have a working object detail template. Yay! I figure I'll do the list the same way.
More on Github: https://github.com/mjumbewu/django-model-filters
Monday, December 13, 2010
A PhillyCarShare API
Developers
The source is available on github at github.com/mjumbewu/pcs-api. It's written in Python to run on top of AppEngine (though it could theoretically be ported to Django with minimal effort). I have an instance up and running at phillycarshare-api-kwawatu.appspot.com/. The app name is purposefully gargantuan, as appengine names are permanent for now, and I didn't want to create any problems for the future.
I know everyone is tight on time, so if there's anything I could do that would make contributing easier, please let me know. I realize the documentation is a bit lacking right now, and I'm working on it :). If you have any questions at all, let me know in the comments, or shoot me a message by email or on twitter @mjumbewu.
Non-developers
If you're not a developer but still want to help test the API or stuff developed with the API, or have any other questions, contact me as well.
Sunday, December 12, 2010
Beguiling Events — A Poem About Systems!
A system is a big black box
Of which we can't unlock the locks,
And all we can find out about
Is what goes in and what comes out.
Perceiving input-output pairs,
Related by parameters,
Permits us, sometimes, to relate
An input, output and a state.
If this relation's good and stable
Then to predict we may be able,
But if this fails us—heaven forbid!
We'll be compelled to force the lid!
Sunday, November 14, 2010
Whoo, BarCamp Philly!
In the morning there was the session on Weaving a Regional Mesh For Open Innovation. I think the content of the session could be summed up in a statement made by the presenter, Joe Raimondo: "The World needs R&D". It was more or less an open discussion on ways to foster and encourage innovation around real-world local problems. We touched on early education, higher education, mobilizing community [human] resources, traditional hierarchical organizational structures, and a host of other topics. Most of it wasn't terribly specific to this region, outside of the examples raised (as most people in the room were from the region). It was an awesome way to start off the day.
Then there was an OpenDataPhilly talk featuring Paul Wright, Mike Jewsbury, Mark Headd, and Stuart Alter (not much of a tweeter) from the DoT. There were some good questions brought up by the audience here, such as how Philly is addressing the issue of 2-way data streams (as opposed to just from government to citizens), and how they (we?) are approaching data and digital tool access with respect to the digital access divide in Philly. Not all of the questions were answered to my satisfaction, but these are hard problems that are only recently starting to get worked out anywhere, so I'll cut the panelists some slack.
After that I chatted and milled around for a while, and eventually ended up in Kris Walker's presentation, Internet as Platform. It was an apt refresher on open-web principles, showcasing the current state and trajectory of web [browser-based] platforms. If I can recall where his slides are hosted, I'll post them here. Update: Here are those slides.
After a nice lunch at Good Dog with Corey Latislaw, Jason Cox, and Pam Selle, I intended to attend a talk entitled "JavaScript is Real Code" with Len Smith, but for whatever reason it was moved to a later time slot. Instead, I and a couple other folks stayed around for an impromptu session on jQTouch with Wil Doane, who shared samples and explanations from code that he uses to teach a class at Hudson Valley Community College. I wanna do that!
After a bit of shuffling around, deciding on my next session, I settled into Riot URLs: Gender Feminism and Tech with Maria Sciarrino and a room full of folks discussing women and men (and boys and girls) in tech. On one level, I enjoy sitting around talking about bias in tech (takes me back to my college days), but there's something weird about it too. Maybe because I feel it's too important an issue to be touched on in an hour-long discussion, which is often as far as people go with it. Maria did express a desire/make a suggestion that the discussion continue in some form. Update: Maybe a regular brunch?
Lastly, I sat for a few in Corey's last-minute addition to the schedule, a discussion around coding for good. I wish I could have stayed for the entirety of this session, but I had my partner's birthday celebration to get to (trade one good time for another).
Still have some thoughts to digest, and I'll do so later, either here or over on Kwa Watu. Just want to say thanks again to the organizers, the volunteers, the sponsors, the presenters, and the attendees of this BarCamp Philly.