The Data-Driven Resume

D3 is amazing. Once I saw a few demos I knew I had to learn it, but what should I build first? It’s always more fun to solve a real problem. It hit me that a truly visual, data-driven resume for developers is way overdue. I’d found my project.

You can see a working demo here and the source here. The rest of this post describes the basics of how it’s put together.

Starting with JSON Resume

I found a promising project called JSON Resume, which made a great starting point for an interactive, data-driven presentation of developer experience. This nifty project defines a standard JSON schema for resume content: you describe your resume as a JSON document and then apply whatever presentation you like to it. See a nice gallery here.

I added a projects section to each work experience entry. Each project has a name, description, start and end dates, and arrays of roles, languages, and tools. This detail is what enables all of the visualizations and interaction.

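Each project entry ends up looking something like this (a hypothetical example; the exact key names in my schema may vary, but the fields follow the description above):

{
  "name": "Data-Driven Resume",
  "description": "Interactive, data-driven presentation of a developer resume",
  "startDate": "2014-01",
  "endDate": "2014-06",
  "roles": ["Developer"],
  "languages": ["JavaScript"],
  "tools": ["D3", "AngularJS"]
}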

Pick a Theme

I liked the Kwan theme. I converted it from Node and Handlebars to AngularJS. This gave me a good starting point to build around.

Categorizing and Normalizing Project Dates

Before I could build any of the charts, I had to iterate through all of the projects, sort them chronologically, and allocate timespans to each language and tool. I then built the data structures each of the charts expects.

Roles over Time


To show the career timeline with the various roles typical developers play, I started with d3-timeline. I had to make only minor tweaks to adapt it from hours to years. There was a nice hover feature that I used to show the relevant project at each point in time for each bar.

Area Charts


An area chart communicates the flow of skills acquisition over time. It conveys more than the typical “X years of Y” table because it shows how the experience was gained. I added filters so you can limit the view to particular roles.

Future Enhancements

The layout and graphics could use a designer’s touch. I think libraries should be broken out from tools, and we could probably make better use of the project descriptions. There is also room to extend the JSON Resume schema to capture more detail about a developer’s strengths and the kinds of teams and roles they are seeking. I think this is just scratching the surface of how interactive graphics can tell the story of each developer’s experience and direction.

The code is on GitHub. Try it with your own resume.

Confusion Over Structs

I was recently perusing an article called C# developer interview questions and answers. I do a lot of interviews of developers with C# experience, so I like to see what others think are good questions. The article was generally good, but then there was this…

[Screenshot of the question and its answer, which claimed that structs are passed by value and not by reference, are stored on the stack rather than the heap, and therefore give better performance.]

Good question. Good first sentence. Then things start to go downhill.

First let’s look at the claim that “Structs are passed by value and not by reference.” This is technically true, but it betrays a superficial understanding of the language. In C#, all parameters are passed by value. It just so happens that the value of a reference-type variable is in fact a reference. If that doesn’t make sense, read this article by Jon Skeet. A better way to state the point would be: “Structs are value types, while classes are reference types.” That would also cover the part about not being able to inherit from structs, because all value types are sealed.
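To make that concrete, here is a small sketch (the type and method names are purely illustrative). Both parameters are passed by value; the difference is that the value copied for the class parameter is a reference to the same object.

public struct PointStruct { public int X; }
public class PointClass { public int X; }

public static class PassingDemo
{
    // The struct parameter is a copy of the caller's value; the caller is unaffected.
    static void Mutate(PointStruct p) { p.X = 42; }

    // The class parameter is a copy of the caller's reference; it still points at the same object.
    static void Mutate(PointClass p) { p.X = 42; }

    public static void Run()
    {
        var s = new PointStruct();
        var c = new PointClass();
        Mutate(s);   // s.X is still 0
        Mutate(c);   // c.X is now 42
    }
}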

Next we have “Structs are stored on the stack not heap.” This is false. Look at this code:

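Something like this, using the Shape and Point names from the discussion (a sketch; the member names are illustrative):

public struct Point
{
    public int X;
    public int Y;
}

public class Shape
{
    // A value-type field embedded in a reference type.
    public Point Origin;
}

// ...
Shape shape = new Shape();   // the Shape instance, Point and all, is allocated on the heap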

Where will the Point inside the Shape be stored? Clearly on the heap. It is true that value types declared as local variables, even though they may be newed up, are still allocated on the executing thread’s stack space…

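For example (again a sketch with illustrative names):

public void Draw()
{
    // A local value type: even though it is "newed up", this Point lives
    // in the method's stack frame, not on the heap.
    Point p = new Point { X = 1, Y = 2 };
    // ...
}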

Now we get to the best of all… “The result is better performance with Structs.” If it were that simple we would always use structs and wouldn’t even need classes. The reality is that some objects are better modeled as classes and some are better modeled as structs. Discussing what kinds of objects are best modeled as structs would be a great question.

Lightweight Context/Specification BDD in C#

Behavior-Driven Development (BDD) provides all of the engineering benefits of traditional Test-Driven Development (TDD) while also producing a specification that non-developers can read and validate. At its heart, BDD turns the tests of TDD into specifications: English sentences expressed in terms of business value rather than coding or engineering terms.

The most popular structure for BDD today is called the Gherkin format and follows a Given/When/Then format, like…

Given a new bowling game
When all frames are strikes
Then the score should be 300

There are frameworks like SpecFlow to help you arrange your specifications (tests) into this format. However, I find this format awkward and forced. I prefer the simpler format known as Context/Specification (aka When/Should)…

When all frames are strikes
Should have a score of 300

There are frameworks, like MSpec, that attempt to make the specifications read more like English sentences. However, I find that these frameworks get in the way as much as they help. It is also nice to be able to write readable tests with just PONU (plain old NUnit). Over time, I’ve developed a convention that I find easy to write and easy to read. I’ve also developed a tool that turns the tests into a markdown file, which can in turn be rendered as a pretty HTML report.

To show the approach at work, I present some snippets from a hypothetical order pricing system I created as a “developer test” provided by a prospective employer last summer. Here is what I was given:

Instructions: Build a system that will meet the following requirements. You may make assumptions if any requirements are ambiguous or vague but you must state the assumptions in your submission.

Overview: You will be building an order calculator that will provide tax and totals. The calculator will need to account for promotions, coupons, various tax rules, etc. You may assume that the database and data access are already developed and may mock the data-access system. No UI elements will be built for this test.

Main Business Entities:

  • Order: A set of products purchased by a customer.
  • Product: A specific item a customer may purchase.
  • Coupon: A discount for a specific product valid for a specified date range.
  • Promotion: A business wide discount on all products valid for a specified date range.

*Not all entities are listed – you may need to create additional models to complete the system.

Business Rules:

  • Tax is calculated per state as one of the following:
    • A simple percentage of the order total.
    • A flat amount per sale.
  • Products categorized as ‘Luxury Items’ are taxed at twice the normal rate in the following states
    • FL
    • NC
    • CA
  • Tax is normally calculated after applying coupons and promotional discounts. However, in the following states, the tax must be calculated prior to applying the discount:
    • FL
    • NM
    • NV
  • In CA, military members do not pay tax.

Requirements:

Adhering to the business rules stated previously:

  • The system shall calculate the total cost of an order.
  • The system shall calculate the pre-tax cost of an order.
  • The system shall calculate the tax amount of an order.

Deliverables:

  • A .NET solution (you may choose either C# or VB) containing the source code implementing the business rules.
  • Unit tests (you may choose the unit testing framework).
  • A list of assumptions made during the implementation and a relative assessment of risk associated with those assumptions.

You can see that there are quite a few specifications here. It’s a perfect scenario for a BDD approach. Let’s take a look at the specification that most states charge tax on the discounted price, while a few states require tax to be calculated on the original price.

Here is a specification that a standard tax state calculates taxes on the discounted price…

namespace Acme.Tests.ConcerningCoupons
{
    [TestFixture]
    public class When_coupon_is_applied_to_item_on_order_in_standard_tax_state
    {
        private Order _order;
        [TestFixtureSetUp] public void Context()
        {
            Product product = new Product(10);
            Coupon coupon = CreateCoupon.For(product).WithDiscountOf(.5m);
            _order = CreateOrder.Of(product).Apply(coupon).In(StateOf.NC);
        }

        [Test] public void Should_calculate_tax_on_discounted_price()
        {
            _order.Tax.ShouldEqual(.25m);
        }
    }
}

You can see that the test fixture class name defines the context (the when). The test method name specifies the specification (the should). The Context method sets up the context described in the class name. Also note the ConcerningCoupons in the namespace; this allows us to categorize the specification.
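The ShouldEqual call is just a small assertion extension method so the specification reads left to right. A minimal sketch (the actual project may use an assertion library instead) looks like this:

using NUnit.Framework;

public static class ShouldExtensions
{
    // Reads as "actual should equal expected" at the call site.
    public static void ShouldEqual<T>(this T actual, T expected)
    {
        Assert.AreEqual(expected, actual);
    }
}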

Here is the code that specifies the prediscount tax states…

namespace Acme.Tests.ConcerningCoupons
{
    [TestFixture]
    public class When_coupon_is_applied_to_item_on_order_in_prediscount_tax_state
    {
        private Order _order;
        [TestFixtureSetUp] public void Context()
        {
            Product product = new Product(10);
            Coupon coupon = CreateCoupon.For(product).WithDiscountOf(.5m);
            _order = CreateOrder.Of(product).Apply(coupon).In(StateOf.FL);
        }

        [Test] public void Should_calculate_tax_on_full_price()
        {
            _order.Tax.ShouldEqual(.50m);
        }
    }
}

Now take a look at a section of the report generated from the tests…

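The report groups specifications by their Concerning* namespace and renders each fixture and test name as a sentence. For the two fixtures above, that section reads roughly like this (a reconstruction; the generated report’s exact wording may differ):

Concerning Coupons

When coupon is applied to item on order in standard tax state
  • Should calculate tax on discounted price

When coupon is applied to item on order in prediscount tax state
  • Should calculate tax on full price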

Anyone can now compare the generated report to the original specification to verify we hit the mark. It’s a little more work to structure your tests this way, but the benefits are worth it.

The full source for the sample and the report generator are available here.

A Nifty Extension Method for Mapping Property Values by Name

Here is a nice little C# extension method that allows you to copy all of the properties from one object to another where the names match.

using System.Linq;
using System.Reflection;

public static class ObjectExtensions
{
    public static void SetPropertiesByName(this object target, object source)
    {
        PropertyInfo[] sourceProperties = source.GetType().GetProperties();
        PropertyInfo[] targetProperties = target.GetType().GetProperties();

        foreach (PropertyInfo targetProperty in targetProperties)
        {
            // Find a source property with the same name, if any.
            PropertyInfo sourceProperty =
                sourceProperties.FirstOrDefault(p => p.Name == targetProperty.Name);

            // Skip unmatched and read-only target properties.
            if (sourceProperty != null && targetProperty.CanWrite)
            {
                targetProperty.SetValue(target, sourceProperty.GetValue(source, null), null);
            }
        }
    }
}

The usage looks like this…

foo.SetPropertiesByName(bar);
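For instance, given two otherwise unrelated types that happen to share some property names (hypothetical Foo and Bar classes):

public class Foo
{
    public string Name { get; set; }
    public int Count { get; set; }
}

public class Bar
{
    public string Name { get; set; }
    public int Count { get; set; }
    public bool Extra { get; set; }
}

var bar = new Bar { Name = "widget", Count = 3, Extra = true };
var foo = new Foo();
foo.SetPropertiesByName(bar);
// foo.Name == "widget" and foo.Count == 3; Bar.Extra has no match on Foo and is simply skipped.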

I was using AutoMapper for this, but it was overkill for what I needed.

Spell-Checking for ReSharper

Your code will be written once. It will be read (by you and others) many times. Code should be easy to read. This means that I should be able to scan a screen full of code and quickly get a sense of what it does. Layout and spacing play a big part in that. Naming things with real, recognizable words is just as important.

If you buy that, it follows that the words that make up your identifiers must be spelled correctly. I am a terrible speller. As a C++ coder, I loved that Visual Assist would put red squiggles under any word I misspelled. This has been my only regret about moving from VA to ReSharper. In every other way, ReSharper is a win over what VA offered.

I’ve just discovered Agent Smith as a plug-in to ReSharper. I had to make a few tweaks and drop one of the checking rules, but I am loving it. My ReSharper experience is now complete!