Cymen Vig

Software Craftsman

Performing the Refactoring Kata, pairing with Josh and the pairing tour

Performing the Refactoring Kata

At the 8th Light University on Friday I presented the Refactoring kata that Doug (my mentor) has been working on and thinking about for some time. The material comes from the beginning of the Refactoring book by Martin Fowler and other authors. The example code to be refactored is a simple video rental system with three classes: Movie, Rental and Customer. The focus is on a statement method in the Customer class that creates a billing statement for a customer.
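
The code under refactoring looks roughly like this — a paraphrased, simplified sketch of the opening example from Fowler's Refactoring (frequent-renter points omitted), not the exact kata code:

```ruby
class Movie
  REGULAR = 0
  NEW_RELEASE = 1
  CHILDRENS = 2

  attr_reader :title, :price_code

  def initialize(title, price_code)
    @title = title
    @price_code = price_code
  end
end

class Rental
  attr_reader :movie, :days_rented

  def initialize(movie, days_rented)
    @movie = movie
    @days_rented = days_rented
  end
end

class Customer
  attr_reader :name

  def initialize(name)
    @name = name
    @rentals = []
  end

  def add_rental(rental)
    @rentals << rental
  end

  # The long, tangled method the kata refactors step by step.
  def statement
    total = 0.0
    result = "Rental Record for #{name}\n"
    @rentals.each do |rental|
      amount = 0.0
      case rental.movie.price_code
      when Movie::REGULAR
        amount += 2.0
        amount += (rental.days_rented - 2) * 1.5 if rental.days_rented > 2
      when Movie::NEW_RELEASE
        amount += rental.days_rented * 3.0
      when Movie::CHILDRENS
        amount += 1.5
        amount += (rental.days_rented - 3) * 1.5 if rental.days_rented > 3
      end
      result += "\t#{rental.movie.title}\t#{amount}\n"
      total += amount
    end
    result + "Amount owed is #{total}\n"
  end
end
```

The refactoring pulls the pricing logic out of statement and eventually onto the Movie class, one tiny green-tested step at a time.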

It was a lot of fun to present although it was tricky to keep focused on both the refactoring and the dialog. It was a good experience to present again in front of everyone as I will have to do this when I present my apprenticeship challenge(s) on July 6th.

Pairing with Josh

On Friday I also chose my apprenticeship review committee via random selection. Over the next three weeks I will tour among the craftsmen on my committee. Josh is on the committee and while I was scheduling the tour he suggested pairing during Waza or free afternoons on Fridays to work on open source projects. Josh is working on a readme code sample validator. Say you have a Ruby project with a readme in markdown that includes code samples. How can you ensure that those samples keep working, particularly as your project changes over time? That is what Josh is aiming to solve and it was a fun project to work on.

The pairing tour

As mentioned above, the next three weeks will be spent touring with the craftsmen on my apprenticeship review committee. Then I will have two weeks to work on my apprentice challenges, culminating in a presentation on July 6th. The details of the challenges are very sketchy – I won’t know what they are until the Friday before those two weeks. In talking with other apprentices, it’s clear that starting as soon as possible is a very good idea as sometimes things take longer than expected. To be clear, everyone has been careful to follow the spirit of the challenges and not reveal any details. I’m looking forward to the challenges but I’m happy the pairing tour is next!

Comments and Reactions

Day 92: Internal project and kata practice

Internal project

I spent the day working on a form in Backbone to send an email that allowed editing the To, Subject and Body. The Rails action responsible for the actual sending also attached a number of PDFs to the email. This turned out to be fairly straightforward and is mostly complete. Mike and I paired for part of the day on this and we styled the email form very nicely. It’s not perfect but it looks good and it uses good HTML markup (a DL with DT for the labels and DD for the input/display). I’m a big fan of definition lists for forms primarily due to working with Nick at Webitects.

Kata practice

I’m continuing to practice the kata. The steps are getting smooth although it is hard to remember to run the tests that confirm each incremental refactoring step is successful. Most katas use test-driven development with a red-green cycle of writing a test, running it to see it fail and then writing the code to make it pass. The refactoring kata is focused on keeping the tests green while refactoring the code. This kata feels different too because the narration is important, as starting with existing code with three classes is a fair amount of context. It should be interesting to perform.

Internal projects, another kata in progress and soon a pairing tour

Internal projects

The internal project I’ve been working on for quite a while with other apprentices at 8th Light has now been deployed! The project should help quite a bit with some of the regular business work that needs to be done (I’m not sure how much detail I can go into). There is still some work to do on the project but now that it will be in actual use it should help drive the direction of the project more effectively.

You may recall that I mentioned quite a while ago the internal project splitting into two projects. The other project hasn’t required as much work but there have been one or two things mentioned that could be added. This weekend I finally got around to adding one of those features and it was fun to work with CSS and jQuery animations. I changed the display of three columns with a toggle-able fourth into a fluid layout. That seemed to work better when adding another column as the fixed width was getting too narrow but I didn’t want to increase the fixed width. A fluid layout means resizing the browser resizes the content. I’m curious to see how it works out.

Another kata

I’ve been working on another kata that Doug has been interested in for some time. It’s always a struggle to get a video of it prepared in time for review before it’s time to actually do the kata. I originally planned on doing that this weekend but it was time for a break and the sand and surf of Warren Dunes in Michigan was too tempting to pass up. Instead I finished the video today. This time it is well within the allotted time limit and it should be straightforward to shave another minute.

Soon a pairing tour

This is my last regular week at 8th Light. The next three weeks after this one are scheduled to be touring weeks where I will pair with craftsmen at 8th Light for two days at a time. I’m looking forward to this quite a bit except for one thing: vim configurations.

Everyone here who uses vim seems to have their own tweaks and bindings. It would be wonderful if there were an agreed-upon standard pairing configuration that apprentices could use while coming on board and that craftsmen would agree to use while pairing with apprentices (and perhaps each other). One of the reasons I’ve avoided tweaking vim in the past is that quite often I’d be using vim on a system that had no configuration and that I was not allowed to, or would not want to, customize. That really is a minor thing but we’ll see how it goes!

The parts I’m looking forward to off hand are:

  • seeing the types of projects being worked on
  • the approach of the craftsmen to development
  • the format of the stories being worked on and how they are verified as accepted

There is of course more that I’m missing. On the flip side, I too am likely under observation so it should prove to be an interesting experience.

Designing a bare bones CMS for ASP.NET MVC

At my prior position one of my ongoing tasks was to create a CMS on top of ASP.NET MVC. If you’re not familiar with it, ASP.NET MVC is very similar to Ruby on Rails without the magic and without any prescribed data access method (so no equivalent to Active Record in the box).

An important point is that this was not the first CMS my employer had gone through the process of creating. They already had one that was created in ASP/ASP.NET, and it had some performance issues. One of these was that the performance of the data access method via an ORM was not well understood or monitored by the developers. One day I dove into this issue and discovered two facts:

  1. Performance gains were easy to obtain by adding indexes where appropriate, based on analysis of the queries in the SQL Server engine.

  2. There was a fatal performance flaw: the CMS concatenated keys and then did lookups or joins by parts of the concatenated key. In effect it was a composite (compound) key, yet it was stored as a single value and queries targeted subsets of that value!

It is almost impossible to optimize for the second issue. It didn’t look like it would cause the websites to fall over but it did increase the load on the SQL server and increase latency in response. This was typically a negligible amount but once you had a handful or more of these sites hitting the same database and occasionally getting indexed by a search engine all bets were off.

Based on this experience, one of the requirements of my CMS was to be very, very simple: no ORM and no more complexity than necessary. These restrictions, particularly the ORM one, were often annoying; however, in the end I think the CMS we created accomplished these goals. It certainly had some issues that required some ingenuity to resolve, but the performance was great and the code base was small. It had fewer features than the old CMS, but as those of us using ASP.NET MVC became more familiar with the MVC approach it became clear that making the CMS implement complex requirements might not be the best idea. Of course, selling that idea to developers still on the old CMS was difficult; they would need to see it for themselves by working more with ASP.NET MVC. In truth, I suspect the answer was a bit of both: maybe the CMS shouldn’t provide everything, but it could provide a bit more than it currently did.

I will continue this series on the CMS with some background on the data access method, using the nested set model in SQL to find related pages at the same depth and branch in navigation, and other interesting tidbits that came up while working on the project.

Recent experiences pairing, using Amazon S3 with aws-s3 gem, another kata

Recent experiences pairing

This week I’ve paired with Angeleah, Michael, Mike and Ben. We’re all working on the internal project. Angeleah and I worked on a story that was not completed in the prior iteration. The reason it wasn’t completed was that the interface wasn’t done. We had some minimal HTML with input fields while the mock up shown to the client had an edit link next to each field that could be edited and looked much nicer. When we tried to actually implement this we discovered it was very annoying to have an edit link. The work flow would be:

  • Click edit
  • Make change
  • Click save
  • Repeat for each field that needs to be edited

We discussed how we felt it should work and what our options were. We decided that it would be much nicer to have an edit view that toggled all the fields to editable status; then one would click save and return to the non-edit view (which could then be submitted or edited yet again). We checked with our customer via email for approval to this change and then went forward with it. The end result was a much nicer work flow and we brought over some style changes that resulted in a much improved appearance.

Michael and I worked on a couple of stories. One of the more interesting ones was an export feature. Michael had written the backend already with Ben but we needed a front end for it. It was a new feature on the site and starting something new even when it doesn’t seem that big always has the potential for taking more time than expected. It was a pleasant surprise to test-drive it to completion in much less time than estimated.

Mike and I worked on an export feature that required an oddly formatted CSV file for import into a proprietary application. This was frustrating as the import would do odd things like sum up two lines and create one import item instead of two import items. We had to try many variations of the file format to come up with one that did what we wanted. Mike spent a lot more time than I did on this story (along with Michael) and I was very thankful for that.

Using Amazon S3 with aws-s3 gem

Ben and I paired on a story that required we make some small test scripts to upload and then retrieve files from Amazon S3 securely. We used my Amazon Web Services account and the aws-s3 gem with Ruby and it was extremely straightforward. The file upload was very easy. Instead of making a download script, we made one that would print out a URL to the file on S3 with an access token that expired 30 seconds after it was generated. This was also very easy and straightforward.

I spent some time considering how we would use this and realized that if we have a list of resources that can be clicked on we don’t want to have to generate the S3 URL with the access token when the list is made. The reason for this is the aws-s3 gem requires a connection to generate the URL so apparently some kind of interaction is required with the S3 API and generating a lot of URLs for resources may be costly in time. A simple solution to this is to have the resource point to our server which then creates the S3 URL on demand and redirects to it.
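
To sketch what that on-demand URL generation involves, here is the query-string authentication scheme that the aws-s3 gem’s url_for implements under the hood (S3’s signature scheme of that era). All names and credentials below are dummies for illustration:

```ruby
require 'openssl'
require 'base64'
require 'cgi'

# Build a time-limited S3 URL the way the aws-s3 gem does: sign the
# request description with the secret key and append the signature
# as query parameters. Bucket, key, and credentials are dummies.
def signed_s3_url(bucket, key, access_key_id, secret_key, expires_at)
  string_to_sign = "GET\n\n\n#{expires_at}\n/#{bucket}/#{key}"
  digest = OpenSSL::HMAC.digest(OpenSSL::Digest.new("sha1"), secret_key, string_to_sign)
  signature = CGI.escape(Base64.strict_encode64(digest))
  "https://#{bucket}.s3.amazonaws.com/#{key}" \
    "?AWSAccessKeyId=#{access_key_id}&Expires=#{expires_at}&Signature=#{signature}"
end

# An app endpoint can compute this on demand (e.g. Time.now.to_i + 30)
# and redirect the browser straight to S3.
```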

Another Kata

I’m working on another kata. Doug had an interesting idea. I’ve been trying to figure out the right tools. It is taking longer than expected for that part and I know I should already be much further along. I really need to figure this out by the end of the day tomorrow.

Update on internal project

The work on the internal project continues. We’re almost at a point where it is usable. This last iteration was the first one in which a story was not completed. It was disappointing but ultimately understandable. During the iteration we accomplished a few things that didn’t have a specific story but really needed to be done. One thing in particular was modifying how Jasmine was testing all of our Backbone.js code.

Jasmine can load HTML fixtures into a specific DOM area for use in tests. The default path for these fixtures is relative to the Jasmine installation. In our case, we use a lot of mustache templates. So we’d copy over the templates, but changes in the “real” templates would have to be manually copied over. This was really bad because our tests were exercising stale copies rather than the real templates. We modified Jasmine to point to the Rails view directory and updated all the fixture inclusions to use the new paths and it worked out great!

In our project, Jasmine is installed at ./spec/javascript/ which contains the helpers and support directory along with any other directories one might want to create. We added a file called backbone-views.js to the helpers directory:
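
The original snippet did not survive; it might have looked something like this, using jasmine-jquery’s fixtures-path setting (the directory is illustrative):

```javascript
// spec/javascript/helpers/backbone-views.js (path illustrative)
// Point jasmine-jquery's fixture loader at the real Rails template
// directory so specs always exercise the current mustache templates.
jasmine.getFixtures().fixturesPath = '/app/views/templates';
```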

Then a call to loadFixtures would look like:
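
Something along these lines, with a hypothetical template name:

```javascript
// Inside a spec: load the real mustache template as a fixture,
// resolved against the fixturesPath configured in the helper file.
loadFixtures('orders/_form.mustache');
```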

It is a small thing but clearly became essential over time.

Day 65: A chance to try wkhtmltopdf to quickly create PDF files

About a year ago I came across wkhtmltopdf and thought it sounded wonderful. There are many ways of creating PDFs, however almost all are very painful. The idea with wkhtmltopdf is to make a PDF based on HTML including CSS and JavaScript. The program includes the full WebKit rendering engine and has some tweaks to handle page breaks.

When I came across it last, we already had a solution in place. It wasn’t pretty but it worked. For the internal project at 8th Light I looked at another alternative that was already in use: Prawn. The problem with Prawn is the same as the problem with most PDF generation techniques: you have to learn a new language or custom language and while it may allow you to have exact pixel output it also requires a substantial investment in time. Our designers already know CSS and HTML and we didn’t need pixel perfect output so I picked the PDFKit gem (another choice is wicked_pdf which I’d also like to try) and got to work with a sample mock up of HTML and CSS along with web fonts from our designer Chris.

The biggest problem I ran into was that wkhtmltopdf requires absolute paths for all the external resources. PDFKit has a method to load stylesheets that uses a relative path but for all the images, CSS and web fonts I’d need to use a view helper to write absolute URLs. With that problem solved I had a working PDF. Well, sort of… The version of wkhtmltopdf I’m using is an older release and it didn’t seem to like SVG web fonts. Thankfully, the SVG variant wasn’t needed so I was able to remove it from the HTML presented to wkhtmltopdf.
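
The absolute-URL workaround can be as small as a helper like this; the host constant and helper name are illustrative, not from the project:

```ruby
# Rewrite relative asset paths into absolute URLs before handing the
# HTML to wkhtmltopdf, which resolves resources outside any request
# context. ASSET_HOST is a placeholder for the app's real host.
ASSET_HOST = "http://localhost:3000"

def absolute_asset_url(path)
  path.start_with?("http") ? path : "#{ASSET_HOST}#{path}"
end
```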

The generated PDF looks great! It includes the web fonts and the background images as desired. The sample logo definitely looks a bit fuzzier however it is low resolution so once we substitute a higher resolution one or use web fonts to create the logo we’ll have a great looking PDF in a minimal amount of time. There certainly are trade-offs with this approach but I have a hard time imagining doing anything else until how the PDF should look is completely nailed down.

Day 61: Another iteration meeting on the internal project, Patrick presents at 8LU

Another iteration meeting on the internal project

We had another iteration meeting for the internal project. We completed all the stories successfully for the iteration. There were one or two design changes related to stories but the client and the designer made a side deal to take care of the minor changes for 0 points in this iteration.

Our simple syncing approach is working well. There is an obvious way to optimize it: update the API to have an option to only return data modified since a certain timestamp. All of the records we’re getting have Active Record timestamps, which means they have created_at and updated_at on every record. That would be easy, however we’d need to make sure the API was time-zone agnostic, perhaps by requiring datetimes in GMT. For now though I think we made the right choice: there isn’t that much data to sync so brute force is fine.
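
The optimization would amount to something like this hypothetical helper (the parameter name is illustrative): normalize the last-sync time to GMT before asking the API for newer records.

```ruby
require 'time'

# Build the query parameters for an incremental sync: only ask for
# records modified since the last sync, with the timestamp converted
# to UTC/GMT so the API stays time-zone agnostic.
def sync_params(last_synced_at)
  { "updated_since" => last_synced_at.getutc.iso8601 }
end

# The API side would then filter with something like
# where("updated_at > ?", Time.iso8601(params[:updated_since]))
```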

Patrick presents

Patrick presented “Distributing Jobs And Data For System Scalability” as the final part of his apprentice challenge. It was a great presentation and a lot of fun. He incorporated audience participation as part of his presentation (as did the most recent two apprentice presentations by Stephanie and Wai Lee).

Patrick started a simple Rails site on his laptop that presented each person with a single character input box and a submit button. Then we all tried to spell out “helloworld” without a typo. On typo, the attempt restarted. It was very hard to do and even in later rounds when we were allowed to communicate with each other it still took a while. The optimal approach at that time would have been to give 10 people one letter each and have them go in order. That way they could pre-populate the input form and submit in order quickly. In any case, it was a lot of fun and an interesting presentation.

Day 60: syncation continues

Today I paired with Mark to finish up the work on syncing (really a one-way pull) of some data from one system to another system. It was mostly straightforward, however we ran into some odd bugs and learned that .to_json takes a :methods option that specifies what methods should be run on the object being converted to JSON. Without it, those methods are ignored and you get nil values, which isn’t fun at all. Thanks to Kevin for pointing that out to us!
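
In plain Ruby, what Rails’ to_json(:methods => …) does looks roughly like this — the named methods are called and their return values serialized alongside the attributes (the Person class is hypothetical):

```ruby
require 'json'

# Sketch of the :methods behavior: as_json merges the attribute hash
# with the return values of any methods named in options[:methods].
class Person
  attr_accessor :first, :last

  def initialize(first, last)
    @first = first
    @last = last
  end

  def full_name
    "#{first} #{last}"
  end

  def as_json(options = {})
    hash = { "first" => first, "last" => last }
    Array(options[:methods]).each { |m| hash[m.to_s] = public_send(m) }
    hash
  end

  def to_json(options = {})
    JSON.generate(as_json(options))
  end
end
```

Without `:methods => [:full_name]`, full_name never appears in the JSON, which matches the nil-value surprise we hit.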

Day 59: More work on syncing between internal projects

On Wednesday, I worked with Darius on the syncing between some internal projects. For our main project we want to retrieve some data from another system. That system has a wrapper API on it. The API needed to be expanded to return more types of data for the project we were working on. So we needed to modify the remote system, modify the code behind the API on the remote system and then modify the gem that can be used to abstract connecting to the API via HTTP to retrieve records. Then we needed to use the gem to actually pull remote data.

There are a lot of steps here and a lot of places to go wrong. The gem that sits between the systems was creating its own Ruby classes. The initial version was very simple. A class would look similar to this:
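
The original snippet was not preserved; a hypothetical reconstruction — explicit accessors plus hand-written code to populate each attribute from the parsed JSON hash (field names are illustrative):

```ruby
# Early style of the gem's record classes: every attribute is declared
# and populated by hand from the JSON hash.
class Person
  attr_accessor :id, :first_name, :last_name

  def self.from_hash(hash)
    person = new
    person.id = hash["id"]
    person.first_name = hash["first_name"]
    person.last_name = hash["last_name"]
    person
  end
end
```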

That was fine but more complex records required more accessors and more lines of boilerplate code to populate the object attributes. And that kind of code, unless it is generated, is a hassle. Did you forget one of 13 fields? Are you going to test every single field? It becomes a lot of work and it’s just something one is going to get wrong eventually, likely causing odd errors.

I refactored this code by creating a base class:
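
A hedged sketch of what that base class could look like (the class and method names are my assumptions): declare the attribute list once and inherit accessors, population from the JSON hash, and an attributes reader.

```ruby
# Base class for the gem's record types. Subclasses call
# `attributes :id, :name, ...` once; everything else is generated.
class RemoteRecord
  def self.attributes(*names)
    @attribute_names = names
    attr_accessor(*names)
  end

  def self.attribute_names
    @attribute_names || []
  end

  # Populate every declared attribute from the parsed JSON hash.
  def initialize(hash = {})
    self.class.attribute_names.each do |name|
      public_send("#{name}=", hash[name.to_s])
    end
  end

  # Hash of attribute name => value, handy for gem consumers.
  def attributes
    self.class.attribute_names.each_with_object({}) do |name, hash|
      hash[name.to_s] = public_send(name)
    end
  end
end
```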

Now the Person class looks like this:
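
Something like this — a compact stand-in for the base class is included so the snippet runs standalone (all names are assumptions):

```ruby
# Compact restatement of the base class so this snippet is self-contained.
class RemoteRecord
  def self.attributes(*names)
    @attribute_names = names
    attr_accessor(*names)
  end

  def self.attribute_names
    @attribute_names || []
  end

  def initialize(hash = {})
    self.class.attribute_names.each { |n| public_send("#{n}=", hash[n.to_s]) }
  end

  def attributes
    self.class.attribute_names.each_with_object({}) { |n, h| h[n.to_s] = public_send(n) }
  end
end

# The record class shrinks to a declaration of its attribute list.
class Person < RemoteRecord
  attributes :id, :first_name, :last_name
end
```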

And the much larger objects are similar – just a list of attributes that are expected to come from the JSON request on one side and be populated into our object that is passed off to the gem consumer. And now it is easy for the gem consumer to call .attributes on the object and use that for populating its own class instance that needs this data.

rspec: How to mock a method and return one or more of the arguments to the mocked method

Here is our example class:
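
The original snippet was not preserved; here is a hypothetical stand-in with a convert method taking a value and a unit:

```ruby
# Hypothetical example class: convert turns a value into another unit.
class Converter
  RATES = { :km => 1.609344 }  # miles to kilometers

  def convert(value, unit)
    value * RATES.fetch(unit)
  end
end
```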

And here is how we can mock the convert method and return one or more of the arguments to convert:

Day 58: Working on a Rails-based client site, thoughts on ideal backend for Backbone.js

Working on a Rails-based client site

Today I worked with Doug on a client site written in Rails. It was interesting to see a larger code base that makes more use of Rails features (helpers, etc) and testing those features. I wrote a script to import a bunch of data from a CSV file and wrote a migration to update the database schema to accommodate storage of the new data. I modified the seed data generation script to be cleaner and generate more data so there is data present where additions were made in the schema.

Thoughts on ideal backend for Backbone.js

The problem with using an MVC backend for Backbone.js is that typically the controllers are DRY but also do things that are un-DRY in order to make things easier. For example, say you have an orders controller that pulls up order history. You’ll probably need more than just order data – you’ll likely need to join it to a customer, product, address or some other table. The temptation, which makes complete sense, is to have that controller retrieve the related data even though it is out of the scope of that controller. There are of course ways of working around this in MVC frameworks but you’re still trying to shim another responsibility onto a controller action that is unrelated to its primary responsibility.

What is interesting about Backbone.js is that you can have it handle requesting completely unrelated data in order to build views without compromising the primary responsibility of your controllers in MVC. The most simplistic example of this that I can think of is needing to display multiple choices in a dropdown for a field on a form. The choices might be in one table and they are related to the model behind the form via a foreign key. This is a pain point in the typical MVC framework. The controller needs to load the model for the form but also the related choices from somewhere else…

The odd thing though is that if you take this approach with Backbone.js and have it be responsible for combining the model that is being edited along with the choices in the select list in a client-side view everything becomes very DRY. However it also makes one think that perhaps an MVC framework is immense overkill as a backend for Backbone.js.

What does Backbone.js really need on the backend?

Bootstrap: When you first hit a Backbone.js page the recommended approach is to serve an HTML page with the JSON-encoded data (collections of models) within that HTML. In other words the server takes all the data you initially need for your view and encodes it into a string that is put within the HTTP response to your browser. When Backbone.js loads on the client it converts that string into collections of models. The point is to decrease startup time by avoiding having the page load then having Backbone.js fire off requests for additional data. That way the application is fast and responsive.
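
The server side of that bootstrap step can be sketched like this: the initial collections are JSON-encoded straight into the HTML response so Backbone.js can hydrate its models without a second request (variable and collection names are illustrative):

```ruby
require 'erb'
require 'json'

# Embed the bootstrap data as a JSON string inside the served page.
# On the client, Backbone would do: ordersCollection.reset(bootstrap.orders)
template = ERB.new(<<HTML)
<script>
  var bootstrap = <%= { "orders" => orders }.to_json %>;
  // client side: ordersCollection.reset(bootstrap.orders);
</script>
HTML

orders = [{ "id" => 1, "total" => 25.0 }]
page = template.result(binding)
```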

Running: When a Backbone.js application is running in the browser it can be very economical in terms of communication with the server. It can also communicate in a way that is very DRY: each model/collection has a configured URL that the backend server can respond to and so each update can be posted separately keeping clean separation of concerns on the backend.

So what?

Well, when using MVC it feels like the controllers work well in the running stage but the bootstrap is painful. It also feels like during the running stage the controllers are immense overkill. During the bootstrap, all that work to remain DRY gets shot in the foot in the name of performance. That performance is critical but it also makes MVC feel like the wrong backend.

Going forward…

I’m still thinking about what the ideal backend would look like. I think it could be much simpler than an MVC framework. The controllers are much simpler as they are REST endpoints. Security is an important concern that has to remain in the forefront. But what would work well for both the bootstrap and the running phases? Hrm…