Follow Up: Stingray FCC Approval in Question

Per TechDirt’s coverage here, it appears the Stingray device made by Harris Corp. has actually only received FCC approval for use in “emergency situations”.  Despite this limited approval scope, Federal and local law enforcement agencies have been using it for routine investigations, without a warrant, and lying about it, while Harris Corp misleads the FCC.

Luckily, the ACLU is on the case, asking FCC chair Tom Wheeler to open an investigation into the usage of the Stingray device.

Again, we’ve arrested a software developer who merely created a product, yet we turn a blind eye to a corporation actively colluding with law enforcement to routinely violate the privacy of Americans in violation of regulation and the law.

Hopefully Tom Wheeler at the FCC will take this as seriously as he should and perform a thorough investigation, though his background as a lobbyist for the corporate broadcasting industry hardly makes me hopeful that he will side with citizens and consumers.

As TechDirt notes, even if he does initiate an investigation, it will most likely be met with stonewalling from the agencies and corporations involved.  We need to put an end to the lawlessness that has become increasingly pervasive at every level of government, especially collusion and cronyism with corporate interests.

Conaway Update 2: YouTube Metadata Analysis

After discovering and immediately downloading backups of all of Frank M. Conaway, Jr’s videos for investigative purposes, I started probing YouTube’s API for video metadata via a Python script.  My goal was two-fold:

  1. Get exact timestamps for when the videos were published
  2. Get the original video files (if possible) to interrogate their metadata for when the videos were recorded (similar to the metadata on most mobile phone pictures that can include everything from device used, to location metadata)

Unfortunately, while I was working on it on the MARC train Wednesday morning, the videos were pulled off of YouTube, making the metadata no longer available.

Luckily, I had pulled metadata for one of the videos in order to figure out the best way to handle parsing the XML data returned by YouTube’s API.  I was able to preserve the metadata of one video: video010 know this moses hand.  I’ve included an attached copy of that metadata here: video010 know this moses hand Youtube Metadata.

This file includes a published tag which contains an ISO 8601 format UTC datetime value.  This was the information I was trying to retrieve for Goal 1 above, but I was trying to automate it for all 54 videos at once.  Goal 2 will unfortunately go unanswered for now, as the copies of the videos I retrieved were transcoded to a different format in the process of being extracted from YouTube, destroying any associated metadata.

Here’s what I found:

<published>2014-08-29T14:13:59.000Z</published>
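For anyone who wants to reproduce the extraction step, here is a minimal sketch of the kind of parsing my script performed, using only Python's standard library.  The XML snippet is a simplified stand-in for the Atom entry YouTube's API returned at the time, not a verbatim response:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for the Atom <entry> YouTube's API returned for a
# video at the time -- only the element I cared about is shown here.
SAMPLE_ENTRY = """<entry xmlns="http://www.w3.org/2005/Atom">
  <title>video010 know this moses hand</title>
  <published>2014-08-29T14:13:59.000Z</published>
</entry>"""

def extract_published(atom_xml):
    """Pull the ISO 8601 'published' timestamp out of an Atom entry."""
    ns = {"atom": "http://www.w3.org/2005/Atom"}
    root = ET.fromstring(atom_xml)
    return root.find("atom:published", ns).text

print(extract_published(SAMPLE_ENTRY))  # 2014-08-29T14:13:59.000Z
```

Running this against each of the 54 video IDs would have automated Goal 1 in one pass.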

Since this is in UTC (the “Zulu” time zone), it must be converted back to Eastern time in order to be useful.  Currently, the Eastern time zone is observing EDT (edit: thanks to an anonymous comment pointing out my mistake here), which stands 4 hours behind UTC.  Confirmed using this converter.

Out pops 2014-08-29T10:13:59-04:00, which is Friday, August 29th at 10:13AM.  During typical business hours.
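The same conversion can be double-checked in a few lines of Python (a sketch; zoneinfo requires Python 3.9 or later):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# The 'published' value from YouTube, in UTC (Zulu)
published_utc = datetime.fromisoformat("2014-08-29T14:13:59.000+00:00")

# Convert to Eastern time; in late August that means EDT, i.e. UTC-4
published_eastern = published_utc.astimezone(ZoneInfo("America/New_York"))
print(published_eastern.isoformat())  # 2014-08-29T10:13:59-04:00
```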

I’m unsure of Mr. Conaway’s vacation schedule or typical work hours (I imagine city government workers probably have schedules as flexible as federal workers do).  But I think this is enough evidence, in conjunction with the questionable location of filming, to prompt an investigation by the City’s Comptroller’s Office.  To paraphrase Stephanie Rawlings-Blake’s comment on the videos, it would certainly seem to be the first “appropriate action” the Comptroller should take, and Mr. Conaway’s fellow Delegates should probably take notice.

In the course of that hypothetical investigation (and I’m unsure if one is even possible), if Mr. Conaway is required to provide access to either the original recordings or his YouTube account, I would be happy to volunteer my services to analyze any and all available data.

If any of you are wondering how to verify the metadata I’ve provided: it would require Mr. Conaway to unhide his videos (if he hasn’t wholesale deleted them, in which case we’re out of luck) so that the metadata can be probed once again on YouTube, though I’m unsure whether making them public again would reset the “published” metadata to the date they were made public again.

I hope I’ve documented my process in sufficient detail for it to pass muster with anyone else who wishes to peer review this analysis.  If I am mistaken anywhere, please let me know in the comments and I’d be happy to update this article, with attribution to whomever points out the flaws.

The video in question is embedded below:

Conaway Update

The Baltimore Sun caught wind of his YouTube page and Mr. Conaway has taken down the videos.

I’m including them here because I am critical of his ability to effectively hold public office and the videos are my evidence of this concern. Under the Fair Use Doctrine, I’ve included mirrors of all of the videos for download. They speak volumes as a critique of his mental state.  I consider it necessary to include all in their entirety to show the breadth and scale of…this:  (embedded OneDrive share is slow, direct link is here)

The Tale of Frank M. Conaway, Jr: Author

Please note this post was originally entitled “Seriously Baltimore?”.

Frank M. Conaway, Jr. is a current lawmaker in the Maryland House of Delegates, automatically starting a new term since there is no challenger in the November election.  Mr. Conaway is the son of former Delegate Frank Conaway, Sr.  He made it through the Democratic primary earlier this year despite no apparent campaigning.  While only an anecdote, there were no campaign signs visible at any polling places I visited or drove past leading up to the primary.  Who needs contributions to spend on advertising when you have dynastic name recognition thanks to your dad?

I recently had the displeasure of coming across Frank M. Conaway, Jr.’s literary works which he has apparently been writing for the last 13 years and as recently as earlier this summer.  Thanks to Adam Meister for the first introduction to them.  They appear to include the following titles available on Amazon (source of the publishing dates) or through his new website:

  • Baptist Gnostic Christian Eubonic Kundalinion Spiritual Ki Do Hermeneutic Metaphysics: The Word: Hermeneutics (2001)
  • THE 20 PENNIES A DAY DIET PLAN (2012)
    • Aside: don’t be fooled, this largely involves the same concepts as the other books, and I stumbled across some food reviews in the YouTube videos mentioned below.
  • Trapezium Giza Pyramid Artificial Black Hole Theory (2013)
  • Christian Kundalini Science- Proof of the Soul- Cryptogram Solution of Egyptian Stela 55001- & Opening the Hood of Ra (2014)
    • If you’re wondering what Stela 55001 is: it appears to be yet another typo, referencing Papyrus 55001 (regarded as the world’s first men’s magazine, as it depicts erotica), which is in fact not a stela (or stone slab) at all.  That is part of what makes it a unique artifact from Egypt (no other scroll paintings are known to have survived).

Mr. Conaway’s House of Delegates biography page lists him as an “Author” while claiming no authorship of any specific works, so presumably this is the same Frank M. Conaway, Jr.  In the course of researching this post, I have confirmed this assumption thanks to his consistent misspellings, and also uncovered approximately 54 videos/rambling rants posted in the last month on YouTube under the user account 314meta.  They are clearly by the Frank M. Conaway, Jr. in question.

Have a look below at his work entitled “Baptist Gnostic Christian Eubonic Kundalinion Spiritual Ki Do Hermeneutic Metaphysics: The Word: Hermeneutics” (yes, that’s actually the title):


Upon attempting to read his work, it immediately becomes evident that no one, except perhaps Mr. Conaway himself, has ever read this book, much less an editor or publisher.  It is littered with grammatical and spelling errors and contains a delightful pseudo-bibliography at the end (“THE KEYS”), which is also littered with misspellings.  In an attempt to “unlock” Mr. Conaway’s writing with “THE KEYS” provided, I tried to browse to the only website listed, which appears to be an article discussing religion and the brain.  Unless this is another one of Mr. Conaway’s typographical errors, the domain “msnbo.com” does not exist, and a cursory search of the Wayback Machine Internet Archive provided by Archive.org indicates the domain has never hosted anything more than a placeholder page.  I assume this was originally an MSNBC article which doesn’t seem to have been captured by the Wayback Machine.

I’m relatively new to Baltimore, having moved here in 2011 from Washington, D.C. so I had to do a bit more research to get a complete picture of Mr. Conaway in order to put his writings in context.  The City Paper was a great help, detailing Mr. Conaway’s notable baggage in an article from 2006.  The aforementioned article references Mr. Conaway’s 2003 separation from his wife in which she received a protective order against him and swore under oath that he was mentally ill, suffering from bi-polar disorder, and had stopped taking his medication.  Mind you, this is prior to Mr. Conaway’s first term as delegate beginning in 2007.

I do not take mental illness lightly, nor do I claim to be a psychologist.  I want to make it clear that I do not mean to disparage Mr. Conaway for having a condition; however, when that condition may inhibit a public servant from performing duties that inherently require clear thinking and the ability to discern logic from fallacious argument or outright nonsensical delusion, I feel obligated to speak up.

Writing is the manifestation of consolidated thought: the ability to coherently assemble and communicate ideas and arguments through the written word.  Wildly disorganized writing that appears to communicate sincere belief in profoundly bizarre concepts, as illustrated by the content of Mr. Conaway’s literary works, is something I find truly and genuinely disturbing as the work product of an elected official.

Mr. Conaway’s books detail his purported ability to interpret biblical scripture, including deciphering hidden messages in various biblical verses and Egyptian art.  He passes his writings off as “science” as well (in his “talking horse” titled video), which is clearly a delusion.  John Nash, the very real Princeton professor whom Russell Crowe portrayed in A Beautiful Mind, similarly attempted (with perhaps less self-perceived success) to find messages in numbers and equations.  Nash, a brilliant but afflicted mathematician, was diagnosed with schizophrenia, but not before it had unfortunately destroyed his marriage.  The parallels in Mr. Conaway’s life, based on the details provided by the City Paper article, closely mirror Nash’s.  The article above also notes that schizophrenia typically presents earlier than bipolar disorder, with which Mr. Conaway is said to have been diagnosed.  Given the creation, or at least the upload, of 54 videos in the last month, I don’t think it is a stretch of the imagination that Mr. Conaway could be suffering from a manic episode, if his bipolar diagnosis is correct.

While John Nash was integral to the development and advancement of game theory and number theory, Mr. Conaway’s weak resume and lack of accomplishments, short of riding the coattails of his father’s name to election and now re-election, stand in obvious contrast.  Mr. Conaway’s ability to draw the most votes of any candidate in the 2006 general election is inexplicable except by name recognition.  Unfortunately, more than a glancing look at this candidate shows an inability to commit to jobs for particularly long, the pursuit of foolhardy businesses (like his replica car business), and presumably difficulty following through on a bachelor’s degree, given that his years of study were split across three different universities.  This would be consistent with bipolar disorder as described here, and I would especially emphasize the NIMH study discussed in this article, which indicates that the average worker with bipolar disorder misses 65.5 workdays.  I find that unacceptable for an elected representative serving in the Maryland House of Delegates, which is only in session from January to May.

Baltimore City appears to be continuing to elect the politicians it deserves (of questionable skills; of questionable character, surrounding themselves with convicted drug dealers in a town where an estimated 10% of the population is addicted to heroin; and of questionable mental faculties) but not the ones it needs (someone of character and high ethical standards, clear thinking, and unwilling to dodge questions), as is evidenced by the continued decay of many parts of the City.

Perhaps I’m suffering from “The Hubris of the Defeated,“ to borrow from A Beautiful Mind, but anyone who doesn’t think the political game in Baltimore is flawed isn’t paying attention.  Mr. Conaway just happens to be a remarkable manifestation of how broken the local political scene is.


Appalachian Trail Section Hike 2014 – Day 1

This past weekend I took a few days to hike part of the AT through Pennsylvania.  I picked up where I left off a year or two ago, after hiking 60 miles in 3 days from Harpers Ferry, through all of Maryland, and about 17 miles into PA.

Day 1: PA Rt. 30  / Caledonia State Park

I caught a ride with a friend early on Saturday up to where I left off.  The first day out was a bit foggy and on the cooler side.  I decided to push a little under 17 miles from Rt. 30 to the Tom’s Run shelter.  Along the way, I passed one of the nicest looking shelters on the trail!  The Quarry Gap shelter is about 2.7 miles from the park, sits right on the trail, and is impeccably taken care of.  It had a bear-proof metal food locker and even hanging plants with flowers in them!  Pretty amazing.  I was only passing through, but made sure to sign the log book.

The next shelter I came across was the Birch Run shelter, where I ran into two older hikers who were out for the weekend and planning to stay there for the night.  I stopped in, had lunch and chatted with them before heading on my way to Tom’s Run.


When I arrived at Tom’s Run it was still relatively early, so I started unpacking and getting my dinner ready.  There was another person wandering around the camp sites with a metal detector — not sure if he found anything.  A few more folks showed up and I chatted with another older gentleman who was out for a long weekend — we shot the shit about gear, from hammocks (he had made his own) to the JetBoil, etc.  Most of the hikers I encountered during the hike were either out for the long weekend or were southbound thru-hikers — very few people were doing significant section hikes northbound like me.

After eating, I promptly passed out before the sun even went down!  It ended up being a pretty chilly night, with temperatures dropping into the mid-30s, so I was extremely happy to have bought a 20-degree sleeping bag prior to this hike!

Overall, it was a good first day — the rain held off and the hiking was relatively smooth-going.

Fun With FOIA & “The Free State” of Maryland.

Background

The corridor of 295 and 95 between Baltimore City and BWI Thurgood Marshall Airport is used by thousands of commuters on a daily basis.  Anecdotally, it seems the average speed is at least 65MPH and frequently I’ve found myself doing 80-85MPH in order to keep up with traffic.

Except the speed limit is 55MPH.  55.  On 95, which is, at minimum, 3 lanes in each direction.  On 295, which is minimum 2 lanes in each direction.

For comparison, Rt. 64 in Virginia is two lanes in each direction and has a speed limit of 70MPH, reflecting the speed that normal, safe drivers actually drive on the road.  According to the National Motorists Association, speed limits should be set at the 85th percentile speed of free-flowing traffic: the speed that 85% of traffic travels at or below.
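For concreteness, the 85th-percentile speed is trivial to compute once you have raw speed observations.  A sketch (the sample speeds are made up for illustration, not MDOT data):

```python
import math

def percentile_85(speeds_mph):
    """Speed at or below which 85% of observed traffic travels
    (nearest-rank method; real traffic studies may interpolate)."""
    ordered = sorted(speeds_mph)
    rank = math.ceil(0.85 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical free-flow speeds on the corridor (made-up numbers)
sample = [62, 65, 67, 68, 70, 71, 72, 73, 75, 78,
          80, 82, 84, 85, 88, 90, 92, 95, 98, 100]
print(percentile_85(sample))  # 92 for this sample
```

Set the limit near that number and most drivers are, by definition, no longer "speeders."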

Of course, objectively setting speed limits based on data requires 1) Collecting the data in the first place and 2) being willing to give up a source of revenue for the state.

My intention was to find out whether the speed limit in this area is artificially low, which would mean the State is exploiting those it supposedly serves by dipping into their wallets with no rational basis.

FOIA Request

I’ve attached my original FOIA request to the state of Maryland for the raw volume and speed data they’ve captured.  Typically, this is collected with the boxes you’ll see on the side of the road with two air-filled rubber strips crossing the road.  When a car hits a strip, the puff of air triggers a volume count, and based on the timing between the two strips being hit, one can gauge speed.
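The speed calculation from the two tube hits is just distance over time.  A sketch (the strip spacing is an assumed setup parameter, not something taken from the state's documentation):

```python
def speed_mph(strip_spacing_feet, t_first_hit_s, t_second_hit_s):
    """Estimate vehicle speed from the interval between the two tube hits."""
    elapsed_s = t_second_hit_s - t_first_hit_s
    feet_per_second = strip_spacing_feet / elapsed_s
    return feet_per_second * 3600 / 5280  # ft/s -> miles/hour

# Strips 10 feet apart, hit 0.1 seconds apart: 100 ft/s, about 68.2 MPH
print(round(speed_mph(10, 0.0, 0.1), 1))
```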

The original rationale I gave related to the casino being built, though that was not my true reason for the request; I just thought it was mundane enough, and the rationale for a FOIA request typically doesn’t matter anyway.  Somehow I don’t trust the state government to provide me with data if I tell them I intend to use it in ways that could make them look bad.  It’s understandable — everyone, government included, acts in their own self-interest.

First Denial

I’ve also attached the original denial to my request (MD295 FOIA Response), which references several exemptions indicating (surprise!) the state does not have to produce the data for me.

However, the rationale given (and presumably this was written by a lawyer) is incoherent at best, given my request and the actual text of the referenced statutes.

Basically, they claimed that “some of the data” were compiled into intra- and inter-agency memos, which are not available via FOIA if they are not available to private-party litigation or within the public interest.  Except you’ll note my original request did not ask for memos or associated materials, but simply for the raw traffic (volume and speed) data.  So right off the bat, this was irrelevant.  Further, the statement that “some of” the requested data was compiled into memoranda implies there are other records that are, in fact, not compiled into memoranda, yet these were still not produced.  Presumably, they failed to release the “other” records because of the following:

The referenced exemption claimed that producing this information for me would violate a US statute stating that highway safety data cannot be produced in discovery for “…any action arising from an occurrence…” on a highway; more broadly, the statute relates to highway safety and limiting government liability.  Well, great: a FOIA request generally isn’t a legal action arising from an occurrence, and it is specifically separate from discovery, though it can be used similarly.  So again, this is irrelevant.

There was an associated Supreme Court case which had the incredible holding that suppressing data collection efforts actually serves the public interest!  The idea is that officials who don’t feel accountable to constituents are supposedly able to adhere to better data collection standards.  Quite frankly, I find this argument borders on the absurd: we peer review scientific studies, where the entire process is open, including data collection standards, so that others can point out deficiencies or flawed methodology.  That we don’t do the same for government statistics is absolutely beyond me.  The Supreme Court’s argument cannot even be validated or verified, since data collected in secrecy cannot be reviewed for accuracy or methodological flaws.

Furthermore, I don’t even have standing to initiate legal action over this stretch of road: I have had no accidents there, nor any speeding tickets, so there is no risk of me using the data for such purposes.  Even if I did have standing, releasing the data still wouldn’t violate the code, because the code states that the data would be inadmissible as evidence in a legal action anyway.  So there is no harm in releasing the data.  Period.

Next Steps

I’m currently sending in the attached response letter (FOIADenialChallenge).  I intend to pursue judicial remedy if I am again denied, up to and including record custodian reprimand, damages, and fees, since this request was clearly improperly denied.

I also amended my request to include data collection methodologies including what devices and data scrubbing are performed on traffic volume and speed data.  Don’t want to give me the data?  Then I’ll do a meta-analysis on how it is collected to inspect any flaws that may exist there.

Do I expect the state of Maryland to respond with the data I requested?  No.  They have no incentive to do so — in fact, it makes more sense for them to waste as much of my time as possible.  I have a day job.  I can’t dedicate a ton of time to this pursuit, which unfairly tilts the playing field toward the government, yet again.

Ultimately, I expect to have to take this to judicial review, appear in front of a judge, and make my arguments, something I’d prefer not to do, as I am not a lawyer and I abhor the protocol involved in court proceedings.  However, it’s something I’m willing to do if it means I get to stick it to the state of Maryland and ratchet back the expanse of the State in general one tiny bit, simply by using logical analysis.

Once I receive the data, or in the absence of receiving it, I will be writing a Baltimore Sun editorial either about my findings within the data (if provided) or about the unfairness of the FOIA process.  In either case, I intend to name and shame all officials involved in the process.  People generally only respond to incentives, and embarrassment is a fairly powerful one.  They may be immune to providing me data collected on the taxpayers’ dime, but they are not immune to me publicizing the shady proceedings involved in extracting the fruits of my tax dollars from the bureaucracy.

Time will tell whether there is any remnant of Maryland’s motto left in state government.  Are we really still “The Free State”?  Updates to come once a further response is received.

Throwback: Video Recording Hack on the Palm Pre

This is a throwback post, but I never documented my experience regarding hacking the Palm Pre.  To the best of my knowledge, this is how things went down.

Background

Back when the Palm Pre was released, it wasn’t quite a finished product.  Yes, WebOS was a pretty awesome OS at the time, and the hardware keyboard was a nice touch.  But beyond a shortage of apps developed for WebOS, critical functionality was missing from the phone.

One of the biggest complaints was the lack of video recording for the Pre’s built-in camera.  I know many people were looking into a homebrew solution for this, so I thought I’d start doing some research and see if I could contribute.

Plan of Attack

The first step I took was to research the hardware involved.  It turns out that the CCD camera chip used in the Pre was the same as the one used in the OLPC (XO laptop).  Knowing that the OLPC offered video chat capabilities, I started researching how those were implemented.

I had already rooted my Pre and had command line access.  The next step was seeing what software was already loaded onto the Pre.

In one example for the XO laptop, GStreamer was used to pull video from the camera.  I lucked out and realized that GStreamer was already installed on the Pre, including a number of encoders/decoders, muxers/demuxers, and video and audio sinks.  It was just a matter of hoping the encoders were actually implemented properly, that I could actually pipe data through them, and that I could ultimately write the result to the file system.

Results!

After a great deal of dabbling at the command line, I was able to pipe raw H.264 video to the file system, producing a file that was then playable in VLC despite having no proper container format around it:

# Make sure the root filesystem is mounted read-write first
mount -o remount,rw /
# Camera source -> Palm's hardware H.264 encoder -> raw file on internal storage
gst-launch camsrc ! palmvideoencoder ! filesink location=/media/internal/downloads/foo.mp4
# Remount read-only when finished
mount -o remount,ro /

A wiki post detailing my original hack is located at WebOS Internals.  I’ve also attached the first successful video (foo) that I was able to record.  It was around 4AM, and I was sitting at my desk in the dark, hoping that the hack session would pay off!  Ultimately, it did.

I’m writing this article years after the fact, but in 2013 I ran into a guy and we got talking about the Pre; when I mentioned I was the one who performed the original video hack, he remembered it.  Kind of cool to have a little geek street cred!  The original Pre Central post on the topic is located here.

ClosedXML: To Equal or not To Equal?

The Problem

After utilizing the ClosedXML library for Excel parsing, my fellow developer Sean Killeen and I hit a snag:

We were using FluentValidation with custom validators to test records in an Excel worksheet.  Except, whenever an error was triggered, instead of returning the validation failures, it threw an exception regarding functionality we weren’t even using: something about XLColor or some other object irrelevant to what we were validating.  Weird.

Debugging and dropping a break point in the code showed that everything was working as expected, even up to the return statement!  So one morning I declared my intent to solve the bug once and for all.

I realized the issue must be with serializing the response object: if everything is good up to that point, something must be going wrong in the serializer.  So I grabbed the source for the Newtonsoft JSON serializer and started stepping through.

Aha!  When an object gets serialized to JSON, the serializer must recursively walk each child object.  It turns out Newtonsoft checks whether the object is already in a collection of previously serialized objects.  This requires using the collection’s Contains method, which in turn calls Equals on every object within.

The way FluentValidation works is that it will return to you the attempted value that failed validation.  Upon trying to serialize the cell that failed the validation, it was serializing all the properties of the XLCell, hence why the XLColor object was attempting serialization.

Turns out, ClosedXML doesn’t implement .Equals correctly for most of its classes.  That is, if you perform the check against an object of a different type, it throws rather than simply returning false.  I confirmed that the expected behavior for complex objects in C# is to return false, not to fail.  When the serializer called Contains with the XLColor object, it was compared against objects of other types already in the collection, causing it to throw.
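The failure mode is easy to illustrate in Python, where `in` plays the role of `Contains` and `__eq__` the role of `.Equals`.  This is a simplified analogue of the bug, not ClosedXML's actual code:

```python
class BadColor:
    """Mimics the buggy Equals: assumes 'other' is always the same type."""
    def __init__(self, name):
        self.name = name
    def __eq__(self, other):
        return self.name == other.name  # raises if other has no .name

class GoodColor:
    """Correct contract: comparing against a foreign type yields False."""
    def __init__(self, name):
        self.name = name
    def __eq__(self, other):
        if not isinstance(other, GoodColor):
            return NotImplemented  # Python falls back, so == yields False
        return self.name == other.name

already_serialized = ["some string", 42]  # mixed types, like the serializer's cache

print(GoodColor("red") in already_serialized)  # False, as it should be
try:
    BadColor("red") in already_serialized      # 'in' calls __eq__ per element
except AttributeError:
    print("BadColor blows up instead of returning False")
```

The serializer's membership check is exactly this pattern, which is why the innocent-looking equality implementation surfaced as an exception deep inside serialization.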

The Solution

So I checked out their repository and started making updates, plus unit test coverage on each class to confirm proper functionality.  Unfortunately, my laptop croaked shortly thereafter, so I haven’t had a chance to check in my changes and open-source contribution to the project yet.  Perhaps when I have a little downtime soon.

Our stop-gap solution was to pass the serializer a filter to ignore the AttemptedValue property of the FluentValidation failure, since we weren’t using it for anything in this specific scenario anyway.

Conclusion

While this was a headache that required delving into several other codebases, it was a good reminder that something as seemingly trivial as the implementation of .Equals deserves any developer’s attention, and that a unit test covering that scenario is not necessarily a trivial case that can be ignored.


On Technical Recruiting

For the last four months or so my company has dedicated time to recruiting a new senior developer in order to build out our development team from two people. Being in the D.C. Metro area, I figured this would be a relatively simple task: there are a ton of people in the area, many of whom are working on technical projects for government contractors or for the government directly. A permanent position at a private company with excellent benefits should carry mass appeal.

So now, four months, ~20 in-office interviews, and countless pre-screening exams and phone interviews later, we’re still without a new Senior .NET Developer.

Background

I started at the company as a semi-technical consultant, primarily focusing on client work, back in Fall of 2009. After being at the company a few months, I started taking a look at our (at the time) SharePoint 2007-based Claims Administration Platform. It was originally built for the company by an outside consulting firm and was basically a way for us to disseminate complaints to our clients and have them acknowledge receipt of the complaints. It was largely built using BDC web parts, a little bit of custom code for uploading documents and interfacing with our existing SQL databases, as well as using a custom SQL membership and role provider.

Fast forward to today, and we’ve expanded that platform to offer more information about claimants to our clients, including custom workflow functionality that suits their needs. It basically went from a glorified document acceptance system, to a full blown claims administration platform, and we also expanded development into adjacent product areas including insurance coverage.  We’re in the process of transitioning a few clients that remain on the SharePoint system to the re-written platform that is a 5-tier architecture utilizing MVC, WebAPI, KnockoutJS, and OrmLite for the data access layer.

My background with respect to programming goes back to playing with VB back on my dad’s old Gateway 2000 486 (with Turbo button and math co-processor, natch) running Windows 3.1 back in elementary school.  Throughout middle school and high school I continued to take an interest in development, taking whatever classes were available and teaching myself a lot as well. I ended up majoring in Business with a minor in Computer Science, and most of my internships were technical, but I never really worked in a formal development environment. That is to say, I wouldn’t consider myself professionally trained as a web developer.

Our Process

We generally like to do a phone screening to gauge how the individual would fit with the company (we’re relatively young, small, and not rigid — we need self-motivated people, not to hear “that’s not part of my job”). We like to get a narrative on why they’re looking for new work, ask questions about any gaps in their resume, and talk turkey on their past projects to get an idea of how experienced they are.

If the phone interview is satisfactory, we have them do a brief, 5-question online exam on InterviewZen. A side note: InterviewZen, though not polished, is a pretty neat tool: it allows you to create interview questions, has syntax highlighting for a number of languages, and allows you to “replay” the candidates’ responses in real time in order to see how long it took them to answer a question, as well as whether or not a candidate copied and pasted code in order to complete it. A pretty nice way to make sure everyone’s being honest.

We ask some basic logic questions, C# debugging, SQL, as well as asking the candidate to write an AJAX call to some MVC and WebAPI methods that we provide. Generally basic stuff that any senior developer candidate should be able to do in their sleep. We aren’t looking for absolute perfection necessarily, but you know when you see a red flag.

If the candidate passes the pre-exam, which weeds out probably 70% of candidates, then we bring them into the office for a coding challenge, group interview with the development team (generally non-technical), and then for interviews with management.

For our coding exam, we ask the candidate to write a simple MVC-based web application in approximately 2-3 hours. Without giving away exactly what we task people with, it should consist of about three AJAX calls, one model, one view, and one or two controllers (depending on whether they want to use WebAPI). We only demand that it not do full-page refreshes (i.e. that it use AJAX) and that data persist; we don’t require a database, so a static list of data, or storing it in the Session, would be sufficient for this purpose.

Candidates are welcome to use whatever resources they want (let’s be serious, we all Google sometimes); we only require that they not reference an already-completed project of a similar variety. They’re welcome to use any client-side libraries they like, including MVVM frameworks. It doesn’t have to be a straight MVC solution.
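For candidates wondering what non-database persistence looks like, the “static list” idea is roughly the following. This is a sketch in JavaScript rather than our actual C#/MVC stack (where it would be a static `List<T>` or the Session); the `todoStore` module and its fields are hypothetical, purely for illustration:

```javascript
// A hypothetical in-memory "static list" store: data lives for the
// lifetime of the process, shared across all callers, no database needed.
const todoStore = (function () {
  const items = [];   // module-private backing list
  let nextId = 1;

  return {
    add(text) {
      const item = { id: nextId++, text: text, completed: false };
      items.push(item);
      return item;
    },
    complete(id) {
      const item = items.find(i => i.id === id);
      if (item) item.completed = true;
      return item;
    },
    list() {
      return items.slice(); // return a copy so callers can't mutate the store
    }
  };
}());

// Two separate "requests" hitting the same store see the same data.
todoStore.add("write the AJAX call");
todoStore.add("bind the view to a collection");
todoStore.complete(1);
```

The point is simply that the data outlives a single request without any schema or connection string, which is all the exam asks for.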

The Reality

We’ve had literally no success finding someone who can competently complete (or even mostly complete) the in-office coding challenge. As a sanity check, our other former senior developer and I both performed the task ourselves, and each of us finished it in about an hour, give or take 15 minutes.

Our salary range is on par with the industry in the area, and our recruiters know we’re not limiting ourselves on the high end either; it’s basically just a benchmark range. It’s discouraging because we had a guy come in demanding over six figures who clearly hadn’t done much programming: he bound his view to a class whose file also contained a second, non-static class, and then tried to access that second class directly. Are we seriously going to pay someone six figures for that kind of code? You. Have. Got. To. Be. Kidding.

Issues Experienced

  • Data Persistence: Despite the suggestion to use a non-database persistence mechanism (we even tell them to use a static list or the Session), many people were unable to get data to persist at all, which demonstrates a misunderstanding of relatively basic web development functionality.
  • JavaScript: Most people are remarkably weak on the JavaScript side. I mean literally Googling “javascript function syntax” after telling me they’ve used jQuery to at least hide and show elements. We’ve had people visibly look up the jQuery syntax for ID selectors. How can you not know that if you claim you have experience?
    • Inline JavaScript is remarkably common. It’s not a deal breaker for us, but it takes all of two seconds to drop a script tag on the page referencing a separate JS file, and let’s be serious, that’s how it should be done so I can actually minify (and eventually unit test) our JS source.
    • No JavaScript namespacing. Literally only one person has used non-global functions and variables. Bleh.
  • User Interface: Really messy and inefficient user interfaces, if they even got that far. Some people didn’t write a lick of HTML in 2 hours. Others built wonky multi-page layouts for something that is supposed to be incredibly simple and on a single page.
  • MVC Fails: Models bound to views that represent a single list item, rather than binding the view to a collection of said models, when multiple items are supposed to be shown. How do you ever show more than one item in a list when it’s set up like that? Sure, you could do an AJAX call on document ready to pull all of them, but no one actually did that, which means they simply didn’t understand MVC or how to deliver an initial list of items.
    • Models that don’t have fields for the requisite information being captured. If I need to mark an item as completed, shouldn’t the model have a boolean to store that?
  • What do you mean, “server-side”?: Pages with literally no server-side code, that simply manipulate the DOM manually. Clearly, the requirements of data persistence and AJAX calls were entirely lost on these folks.
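The namespacing complaint above is about exactly this kind of pattern: keep everything off the global scope behind one named object, typically with an immediately-invoked function expression. A minimal sketch (the `app` name and its functions are hypothetical, not from our exam):

```javascript
// Without namespacing, every function and variable lands on the global
// scope and can collide with anything else loaded on the page.
// With an IIFE, only one global (`app`) is exposed; everything else
// stays private inside the closure.
var app = (function () {
  var selected = null;        // private state, invisible outside the module

  function select(id) {       // private helper, not reachable globally
    selected = id;
  }

  return {
    choose: function (id) {   // public API, attached to the one namespace
      select(id);
      return selected;
    },
    current: function () {
      return selected;
    }
  };
}());
```

With this shape, `selected` and `select` cannot be clobbered by (or clobber) any other script, which is also what makes the code minifiable and unit-testable later.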

Conclusion

We’ve been working with a number of recruiters, as well as posting jobs on Stack Overflow Careers. I can now say with certainty that the idea that a non-technical person can efficiently place a technical candidate into a technical position is flawed.

The candidate pool we’ve seen has been atrocious, and it makes me worry for the industry and the country. If we can’t even fill “knowledge economy” jobs anymore, it’s not really surprising that America is falling in global competitiveness. That said, it isn’t a purely domestic problem: we had some foreign workers on visas come through and fail our recruitment process just as catastrophically.

I basically have no faith in resumes or recruiters at this point. I’m hoping to use Betamore’s co-working space in Baltimore as a resource for honing our recruitment process further, or possibly even to find a developer who already works out of their space.

What just astounds me is that my degree is not particularly technical, nor was our other senior developer’s undergrad, yet we’re far more capable out of the box than 99% of the market. It’s just crazy to me. Perhaps it’s the nature of hiring people out of government positions, where they may sit in a huge pool of developers and be able to hide their incompetence, or face less-strict deadlines than private industry demands, making the “get shit done” side of the job less critical to being considered successful.

We’ll keep on trucking… I will not settle for hiring someone less experienced than me, primarily because I need, at minimum, a peer to bounce ideas off of, and ideally someone who knows a lot more than I do. That’s the best way to grow a team effectively and organically, and it’s something on which I simply will not compromise.