Geeks are People Too

This is Going to Hurt You More than Me

Greetings from the end of a self-imposed blogging silence: I got the aforementioned email and am happy to state that I will shortly be joining Microsoft. Sur La Table was very diverting and offered many challenges with respect to data, but it's hard to pass up an opportunity to work in, and with, big data.

As a result of that interview loop, plus some interviews I did for an open position we have at Sur La Table, I’m here to write something Very Important: Don’t Lie on Your Resume.

Typically when I am called in to conduct a technical interview, I read the candidate’s resume, and then ask the hiring manager how technical they want me to get. If it’s me, and I’m hiring for a developer, I’m going to get very technical, and you’re going to spend 100% of your time with me at the whiteboard. If it’s for someone else, and I’m hiring for say, a PM, or a QC, or technically-minded-but-not-otherwise-a-developer role, I’m still going to test you on skills you state in your resume.

So when you tell me that you have a lot of experience with SQL, or that you've been using SQL for five or six years, I'm going to run you through the basics. Either of those statements tells me that you know the four major joins, you know the simplest way to avoid a Cartesian product, you know how to filter data in a join or in a where clause, and you know how to subquery. I'm not even getting into more advanced territory like transactions with rollbacks, while loops, or indexing — the aforementioned list is what I would characterize as basic, everyday SQL use.
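
For the record, the bar I'm describing looks something like this — a sketch against two hypothetical tables, customers and orders (names mine, invented for illustration):

    -- The four major joins are INNER, LEFT, RIGHT, and FULL (OUTER).
    -- An explicit ON clause is the simplest way to avoid a Cartesian product;
    -- "FROM customers, orders" with no join condition multiplies every row by every row.
    SELECT c.name, o.total
    FROM customers AS c
    INNER JOIN orders AS o
        ON o.customer_id = c.customer_id
        AND o.status = 'shipped'   -- filtering in the join...
    WHERE c.region = 'WA';         -- ...versus filtering in the where

    -- A LEFT join keeps every customer, orders or no;
    -- an INNER join would silently drop customers who never bought anything.
    SELECT c.name, o.total
    FROM customers AS c
    LEFT JOIN orders AS o
        ON o.customer_id = c.customer_id;

    -- A subquery: customers whose lifetime spend beats the average order.
    SELECT name
    FROM customers
    WHERE customer_id IN (
        SELECT customer_id
        FROM orders
        GROUP BY customer_id
        HAVING SUM(total) > (SELECT AVG(total) FROM orders)
    );

If you can rattle those off, and can explain when a filter belongs in the ON versus the WHERE (it matters for outer joins), you've cleared the "basic, everyday" bar.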

Imagine my dismay, then, as an interviewer, when after declaring (either verbally or on your resume) that you are a SQL expert, you can't name the joins. Or describe them. Or (worse) describe them incorrectly. When you say you know SQL, and then prove that you don't, it makes me wonder what else is on your resume that you "know" that I can't as easily disprove in the interview. The default assumption, for the protection of the company, is that your entire resume is a raft of lies. It's the surest way to earn a "no hire".

It would have been far better to state the truth: someone else wrote SQL scripts for you, told you what they did, and you were adept enough to figure out when there was a disparity in the output. That does not mean you "know" SQL; it means you know how to run a SQL script. This gives the interviewer an honest window and the ability to tailor your time together (remember, they're getting paid by the company to spend time with you; if it's productive, it's not a waste of money) to figure out your strengths and weaknesses. Having just been hired into a position that works with big data, where I was honest that the largest db I have worked in and with was about 3TB, I can attest that it's really hard to look a hiring manager smack in the eye and say: "I have 90% of what you have asked for but I'm missing that last 10%". It gives them the opportunity, however, to decide if they're going to take the chance that you can learn.

If they’ve already learned you’re honest, then that chance-taking looks better in comparison.


In Development

I was at a holiday gathering the other day and during the usual course of “…And what do you do?” I replied that I was a developer. The inference was that I was a Real Estate Developer; I had to explain that I was a Make the Computer Do Useful Things Developer. I was talking to two ladies about my age (Hi, I’m 40), and was surprised at the reply: “Oh, that’s unusual!”

I suppose I should not have been. I know a lot of women in IT, but darned few who do development.  To be clear: most of the women I know in the Information Technology space were at one point developers, or have a passing knowledge of some development language. They merged into Project or Product Management, or Business Analyst roles. These roles require knowing what is possible of code without actually having to write any of it, and so if you get tired of the incessant progress of development technology then that is one way up and out (and it is a way I took, about five years ago).

Careers arc and opportunities knock and itches flare up and I am once again a developer.  And I find myself, when talking to people who don’t work with or know other developers, battling not only the usual misconceptions about development, but the gender-based ones as well.

Development (in IT terms) is the handle one applies to the concept of using a series of commands (code) to tell the box (tower, laptop, server, etc.) what you want it to do; if you want it to take in something or not, if you want it to spit out something or not. In order to create this blog post many people did varying forms of development (from creating the templates that instruct the browser how to make this post look all shiny, to the protocols that tell the server where to put this post, to the widgets on the front end that tell you things like I haven’t posted in a while). If I typed it in MS Word, that required a bunch of other development by a bunch of other people.

Development is not:

  1. Something you can do on five screens drinking 3 bottles of wine to create a “worm” that appears as a graphic on your screen (as in Swordfish), and usually doesn’t involve a developer logging an Easter Egg of themselves in a bad Elvis costume with sound effects (as in Jurassic Park)*. If I drank 3 bottles of wine and was looking at 5 screens they’d probably be the ones you see in a hospital room, and the only graphics I would see appearing would be the “worm” that is my heart rate monitor flat-line.  And while I have myself buried Easter Eggs and commentary in code, it isn’t that elaborate because you don’t typically have time to build elaborate things. You’re busy rewriting all of the stuff you just wrote because someone decided to change the scope of your work.
  2. Anything involving a graphic user interface (GUI). When a developer talks about manipulating objects, those objects are typed-out phrases, not boxes that are dragged and dropped. There are some development environments that offer up a GUI in tandem with the "scripting" – that bit about writing out words I was talking about – but more often than not they are there to illustrate what you have scripted, not to assist in your scripting.
  3. Finite. Development technology is constantly changing and no one developer knows all of the development methods or languages. That would be like someone knowing all of the spoken languages in the world. Rather, it’s typical you’ll find one developer who “speaks” one development language really well, or maybe a branch of languages (much like you run into a person who can speak Spanish and French and Italian, because they are rooted in the same “base” of Latin, it’s not uncommon to find someone who can code in ASP.Net and VB.Net and C#.Net, because they’re all of the Microsoftian .Net base).  No one hires “a developer”, they hire a .Net Developer or a Java Developer or a Ruby Developer or what have you. Specialization exists because the base is so broad.

Modern cinema has done an injustice to developers in terms of making what we do seem both simple and sexy; the "shiny" environments typified by the interfaces "hackers" use on-screen look really slick and probably took real developer-hours to make look good… with absolutely no real purpose. That said, actual development can be simple (with clear requirements and a decent knowledge of the things you can and can't do) and can be quite sexy (if you're sapiosexual). It's just not well-translated in current media. (To wit: Jeff Goldblum uploaded a virus to an alien system from a Mac laptop. He didn't have to know the alien system's base language, machinery, indexes, program constraints, functions, etc. And it was on a Mac, in the '90s, when development was not one of its strengths.)

Most of what development is, is trying to solve a problem (or two), and generating endless logic loops and frustrations along the way. You build a “thing”, you think it works, you go to compile it or make it run, it fails, you go dig through what you wrote, find you’re missing a “;” or a “,” or an “END” or a “GO” or a “}”, re-run, find it fails, and go dig through some more. For every hour you spend writing out what you want it to do, you spend about an hour figuring out why it won’t do it.  This process of “expected failure” is not sexy or shiny or ideal, and that’s why it doesn’t show up on-screen.

These are misconceptions every developer, regardless of gender, has had to deal with at some point. Some deign to explain, some gloss over, some simply ignore; much like I really hope we get a socially-functioning, intelligent person on-screen soon, so do I hope that we get a showcase for the simple elegance of real development.

It would be great, too, if there were more female developers on "display" as well (and not for their bodies, hence the scare quotes). Think through every movie you've ever seen that shows people doing any real development, "hacking" even (a term that is abused beyond recognition); how many were female? Go back to the movie "Hackers" — did Angelina Jolie actually, ever, really type anything? You inferred that she did, but the real development, the real "hacking", was done by the crew of guys. Oh, and that's right, she was the only girl. The Matrix? Carrie-Anne Moss spent precious little time in front of a computer there. She did look damn good in skin-tight leather.

Fast-forward a decade (or two) and we’re pretty much in the same boat. You see women behind computers on-screen, but they are typing in word processing programs or moving the mouse to click it on the shiny picture of the Murderer/Prospective Boyfriend (or, you know, both). They aren’t buried under a desk trying to trace a network cable or eyeballing multicolored text trying to figure out *WHY* it won’t compile, they’re delivering the shiny printout to the Chief/Doctor/Editor from which Decisions Will Be Made.

We find it surprising in social circles, I suppose, for women to be in development, because we don't see it exemplified or displayed in any of our media. TV, movies, even proto-development toys for children often feature eager-looking boys interacting with them, while the girls are relegated to the beading kits and temporary tattoo sets (actually, there's precious little out there for getting your child, regardless of gender, to learn code, but that is changing). We have crime-solving anthropologists, we have NCIS ass-kickers, we have cops and coroners; maybe it's time we had a developer.

*Jurassic Park is a good example of both great and poor development display. Right before tripping that “Dennis Nedry Elvis Graphic”, Samuel L. Jackson’s character is eyeballing Nedry’s code. That stuff that looks like sentences that don’t make sense? That’s code. That’s what it looks like, for the most part. Unfortunately, later on when the little girl is hacking the “Unix System” that “she knows”, it’s all graphical. And that’s not accurate.

Plus One To Awareness

Yesterday at 10pm local time ended my 24-hour vacation from any sort of connectivity (including the ability to "google" anything, text anyone, etc.). If you think it's simple, try it in a place as connectivity-savvy as the Magic Kingdom. There's an app to navigate the kingdom that includes line times, parade routes and hidden Mickeys. I couldn't download or use that: no phone. There's free wi-fi in the hotel and in the parks. Nope. In a line for Space Mountain where every 3rd person is lit from beneath (thanks to their iPhones and, in a couple of cases, iPads), connectivity sure would provide an answer to the waiting game.

When I turned my phone off I made an analog list (pen, paper) of all the things I’d use connectivity for if I had the ability to, and the time.

  • At 11pm that night, finding it difficult to fall asleep and devoid of reading material (I had finished it), I really wanted to lull myself to sleep with my twitter feed, but I didn't.
  • At 3am I wanted to look up the symptoms of food poisoning (yes, it was), but I didn’t.
  • At 9am the male child asked if he could bring his DS into the park to keep him occupied; when I incredulously turned to explain that the whole park was designed to keep him occupied, I discovered he was teasing me. I really wanted to tweet it. But I didn't.

And on it went. In the line for Space Mountain I wanted to share the statistical correlation between a person with an iPhone and a lag in line continuity, I wanted to look up the name/number of the restaurant we are to eat at tonight, I wanted to check the terms of the Disney Visa and see if it really was the good deal it was purported to be.

But the thing that really got me was pictures. I couldn’t take pictures.

Pictures of the male child when he finally got his sword (it’s impressive), of the lush greenery that would exist just fine here without the careful maintenance it gets, but would die in two weeks outside in Washington, of the attention to detail this park gives to its art and architecture. “The floors here are *really clean*,” the male person said, as we trotted along in line at Space Mountain. (This was fortunate for the teenager in front of us who, when the line stopped, would sit down on them. Just plopped right down. Even if the line moved again, and then she’d try to scoot along on her ass. Ridiculous, naturally.) It became a challenge to find something out-of-place anywhere.

Therefore, today, fully connected, app-in-hand, there will be pictures, and tweeting, and tweeting of pictures, and Foursquare check-ins, and more pictures.

PS – for those wondering, my personal email counted 74 messages for the 24-hour period, advertisements included, of which only 2 were legitimate communications. My work email counted 14, of which 8 were things that were not about me and were completely resolved before I got online, 2 were social (one going-away notice, one lunch notice), one was a meeting change notification, and 3 were legitimate to the project I was working on.

PPS — Grog the Luddite would like to mention he's really a sensitive, un-macho, stop-and-smell-the-roses kind of guy who likes technology just fine and even knows a thing or two about it; he just wanted me to realize that there was life outside of it. Point taken.

Dabble, Dabble, Toil and Babble

“Your biggest problem”, he stated flatly, “is you’re a dabbler. You don’t specialize in anything. You are not going to succeed because you do not focus on a given talent; you just dabble in this and that.”

This was actually stated, to me, in a 1:1 with my boss at the time. He was a financial services guru and I was his personal and executive assistant, so assigned because I was technically inclined and could type fast. In short, I was good enough to be his e&pa because I dabbled.

Despite my initial reaction, this was meant to be a positive speech: it was going to Incite Me To Action and I was going to Make Something Of Myself. Instead, I quit the job, moved back home, and dabbled some more.

I dabbled my way into SQL.

Then I dabbled my way into ASP.Net. Then I dabbled into VB.Net.

Then I dabbled into SQL some more, and into project management. And the dabbling continued, through business development, communications, operations, and back into development (but C# this time).

“Which one of your degrees does this job come from?” wondered my stepmom one night in Spring when I told them I had acquired this one. “None of them!” my dad said wryly.

My old boss is correct: I am a dabbler. None of the things I have done have I truly specialized in. There are better people at SQL out there than I am; there are certainly better people at .Net and BusDev. But there are damned few who can speak those languages and are willing to translate them, painfully, carefully, into shiny PowerPoints and ROI-laden SWOT analyses.

A few months back I had my midlife crisis; it lasted 36 hours and was in the vein of "what am I DOING with my life? Where will I go next?" And I realized that every other time in my life I'd been faced with that question, things unquestionably got better, more exciting, and more rewarding.

I have friends who went to college for what they ended up being in life, they seem happy and fulfilled. I have friends who picked a field and stuck with it, and will have a decent retirement to speak for it. My own parents offer four different examples of picking a road and trotting down it come hell or high water and they’ve all done fine.

I do not believe, though, that any of that diminishes success by a diagonal route.

Owning Your Data

I realize I’m terribly late to this party. I’m not even fashionably late, I’m “you arrived just as the caterers were cleaning up and the hostess had taken off her shoes” late. I’ve been busy (as, I think, I’ve amply covered).

However, I really must say a word or two about Reinhart and Rogoff.

For those who don't follow economics, or kinda remember they heard about it but aren't sure what the big hullabaloo is, I recommend you google it; look to the Economist, the Guardian, and the Atlantic for non-editorial resources to start. There are a few. Then you can go off to the editorials for dessert. For those who don't want to google, here's the Twitter version: two economists presented a work suggesting that there is a steep drop-off in economic performance once government debt gets high enough — a finding used to argue for austerity measures. Essentially they said that when debt is high, growth slows to a grinding halt; the graph they presented roughly resembled the cliffs of Dover.

And it was wrong.

Because of an Excel spreadsheet formula error.

Normally this wouldn't be awful. Anyone, and I do mean anyone, who has used Excel to convey data (or volumes of analysis) has made that spreadsheet error, and it can be as simple as a Sum formula that doesn't cover the full range, or as complex as messing up your Vlookup in your nested IF statement. (R&R's, per the researchers who checked their spreadsheet, was the simple kind: an average that stopped several rows short of the data.) Excel has been bastardized over the years into an analytics function (largely by default, in that it's on nearly every machine) that it really can't fully accommodate without failsafes; EVERYONE makes an Excel error.

Reinhart and Rogoff's mistake is NOT that they made a spreadsheet formula error. And, contrary to the article I linked to above, it's only partially that they did not peer review.

It was governments’ (plural, many, varied) mistake to use it to shape policy.

Lookit, suppose I told you that, according to my Excel spreadsheet, you were very likely to die from dehydration if you didn’t eradicate all but 0.4 grams of salt per day from your diet. For perspective, the average diet has about 5 times that. You would very rightly look to other studies, other data, other sources of information. You’d poll your neighbors. You’d check with friends. You’d do your due diligence before you used my say-so, no matter how shiny my Excel spreadsheet, or even how shiny my MD would be (this is fiction, after all).  Plenty of people are told by their doctor to lose 10lbs because it will make a difference in the long run, and plenty of people seem to blithely ignore it because they don’t have corresponding (personal, attributable, anecdotal) data.

So why, why, why did any government, financial body, fiscal institution leap on the screeching panic train when R&R’s study hit?  Why did no one look to a 2nd opinion, a different study; why didn’t they check the data for themselves before subjecting their economies to the fiscal equivalent of a rectal exam?

I have been in data now for 15 years. It’s not a long time in the scheme of things, but it’s something I’m known to be passionate about. I can go on and on about how data works, or doesn’t; what you can derive from it; how data *is* integrity if done right. Any form of analytic reporting that is worth its salt has been tested, peer-reviewed, and validated against two or three other methods before it is used in a practical space. At Expedia, at one point, I managed 500 ad-hoc requests per month, and each of those was eyeballed against existing reporting and a decent sense-check before being used to cut deals (or not).

Now, please understand: R&R screwed up. And, apart from their formula error, they insist the outcome is the same (and it is, but it's the equivalent of saying "ok, it's not a steep drop-off anymore, more of a speedbump, but still, it's a delta!!"). This is the foible of the data monkey; again, something we've all been prey to. But not all of us have done it in a way that implicated large (and small) governments, and most of us have learned to admit when we're wrong. That is the crux of it: if no one is perfect and no data is perfect, to pretend yours is, against evidence to the contrary, is specious at best and negligent at worst.

I argue though that the more egregious mistake is to *follow* that data without validation. To quote Ben Kenobi: “Who’s more foolish, the fool, or the fool that follows him?”

Typing

It's my "me" night — the boy is with his father, the man is with his brother, and I am home watching a James Bond movie. It's "Thunderball", released in 1965; at that point in history my father had been in the country one year, I do not believe he had yet met my mother, and I was -8 years old.

All of the women are decorative, deadly, or both. Any one of them who was competent and even remotely personable was a secretary. The only two exceptions were a deadly assassin (ultimately, and inevitably, poor at her job) and the clueless, innocent heroine.

When I was in 8th grade, typing was a requirement for everyone, but you had to do it on an IBM Selectric that was only slightly quieter than a beehive. Typing had time-tests as well as visual tests — you could NOT type the volume in the time if you hesitated to look at the keyboard. I had managed to multi-task and eyeball the keyboard through the first quarter, so my second quarter C’s were not welcome at home. (In point of fact, C’s were never welcome at home, but A’s that went to C’s were very much not ok). My grades came home and my parents acted.

My stepmother grabbed a sheet of blue, circle-shaped stickers. And covered every key in the keyboard of the computer my brother and I used. It was torturous. But I learned to type.

Not to become a secretary.

Seventeen years ago I took a couple of classes at the local community college to learn how to program websites — I was a "web developer" back when everybody was, and it spawned a slightly profitable side business. In 2000 I took classes in DB development; by 2003 I had argued my way into a dev job. In 2004 I got the dream job, at Expedia, doing development in their Reporting group. By 2010 the good jobs had moved to Geneva and I had to find other pursuits. By 2013, I had tired of "other pursuits".

Today I find myself with two keyboards, two machines, a multitude of projects and lots of things to build. I type a lot these days. But I’m not a secretary.

Sur La Awesome

My resolution to blog more often has gone by the wayside courtesy of a new job. I started working at Sur La Table about 10 (calendar) days ago (officially) and I’m having a bit of a hard time.

I’m having a hard time separating reality from all of the awesome.

Any time you start a new job, you’re going to be in a “honeymoon” period. Everything is new, and different. It’s a bit like the 4-week rule I had when I was dating. It went something like this:

Week 1: Dating again. Ok, this is cool, this is normal, everyone dates. Cool.

Week 2: He can do no wrong! He’s going to be a Doctor or Lawyer or Artist or Trashman and this totally meets with my life plans because of X/Y/Z contrived plan.

Week 3: He has a fault. It’s not a big fault, it’s a fault; everyone has faults! I’m totally not judging!

Week 4: The fault… has spawned. It has morphed into one giant gelatinous blob of fault-ness, and I can’t stand it.

(At the end of week 4 I’d dump him. He was still on week 1.)

Fully aware that I’m in week two at my new job, I’ve been doing my damnedest to be diligently down on the novelty, and… it’s just not working.

I get to *build* things again. My professional experience with C# is very, very little and very, very old, but I’m almost done building a nifty little widget complete with error handling. I’ve reaffirmed my faith in Stack Overflow, my lack of faith in MSDN, and re-verified that “Dummies” books are anything but. Half of my day is spent “managing” (two rock stars in their field, incidentally) and the other half is spent “creating”. There are two good coffee sources (NOT including those directly in-office) nearby, two Subways, and my desk has a view of Mount Rainier.

Don’t get me wrong: we’re a small shop. There’s a lot of cross-functional, “ok-you-don’t-know-it-so-can-you-build-that-into-your-estimate” expectations, a lot of last-minute, “oh by the way”. But… I get to *build* things again.

And… there are no more 5am meetings (or 6am, or 7am, or 8am). My earliest meeting is 9, most people don’t set one past 5. People show up, they work balls out, they go home. A tremendous lot gets done and while the shortcomings of the vendor/system/funding/etc. are all publicly, and explicitly, acknowledged, this somehow does not diminish the drive of the people who are involved.

We are selling kitchen supplies for the devoted chef. We are not saving lives, we are not universally accessible. But we are providing you the very best that you can get, at the very best value you can get it, with the very best, real advice you can get it with. We are trying lots of things, and we are experimenting, and we are innovating. And yes, my first paycheck will likely be contributing to my future Le Creuset collection. The real value, however, is that I get to build things again.

Even if it means I hit Stack Overflow six times a day.

Transition

Managing a transition either is awesome or it sucks; there doesn't seem to be a "transitory" mood to it. Either everything buttons up all sweetly or everything runs amok at the last minute. Or so it seems.

My transition between Expedia and Sur La Table is marred by my boss's work trip, my personal trip, and a whole host of concern over who takes over which piece of work management. Not to fear, the formal plan has been (properly) vetted and communicated; now comes the task of actually putting those succinct bullet points in place. For the most part they're aligning nicely, so I'll deem this transition "awesome".

I'm very much looking forward to my new position, and a bit sad to leave Expedia, although I really do feel it was time. After nearly nine years, 8 offices, 7 countries, 6 bosses, 5 titles, 4 buildings, and 3 groups (not including a brief reorganization into Finance (?!)), it's time to see new things. And so I go from Passion One (Travel) to Passion Two (Cooking).

When I was 15 I got a job at a Dairy Queen. “Don’t worry,” they said, “after a couple of days you won’t like ice cream or fast food anymore. Everyone loses weight.”  That actually was true for me but more because the walk to and from work was a mile each way, which was certainly good for my food-centric self. I am not, nor have I ever been, known to eschew a Blizzard or a cheeseburger. Going to Sur La Table does not mean I will stop cooking, it will mean I will want to procure more cookware and do more things, and that is an exciting prospect.

Aside from the added incentive to create in the kitchen, though, is the incentive that I will be creating product again — specifically technology product. I'll be running a small development team, as well as doing some dev myself, and I'm extremely excited at the prospect. I'm quite rusty in parts — although the SQL whiteboard was fun, my C# skills are woefully outdated — and so the next few days will be that awkward position of cramming for the "new" job whilst handing off the old.

Transition, indeed.

The Economics of (a Minor) Failure

First, let me point out I’m safe. I am sitting in Heathrow, for the 2nd time today, waiting to get on my flight. For the 2nd time today.

Twenty minutes into the flight I realized we hadn't gone above 10,000 feet. Another minute later all cabin crew were called to the cockpit — over the PA system — and this, if you pay attention at all, and you haven't had anything to drink, and/or you have a deep-seated fear of flying you totally forgot about until just the moment you hear this, will make you quietly fret. Then if you pull up the travel map on-screen and discover that for the last ten minutes you've flown in circles, well… you're pretty not happy.

We couldn't pressurize. They tried everything the ground crew suggested, and none of it worked; so they confessed (our Captain was extraordinarily calming), and flew over the water to dump fuel (fun fact: it's dangerous to land a fully fueled plane, because the wings are so full of fuel). We spent 20 minutes dumping fuel that vaporized as it exited the wings; it was both spectacular and appalling (to those of you on the east side of the English Channel: you may notice an odd taste in the air…). Imagine a fire hose strapped to the underside of the wing of a plane, and then turn it all…the…way…on. For twenty minutes.

After that was complete, we went back inland and landed.

We were handed 10GBP vouchers. For reference, this purchased one tomato-and-mozzarella sandwich, one bottle of water, and one glass of wine. The flight was full (no space), and so this got me thinking about the economics of this little enterprise.

We flew a 747-400, which has a fuel capacity of 57,285 gallons and a passenger load of roughly 416 people (1) (for the 3-class version, which is what I was in), though British Airways uses 345 for their figure. The plane consumes 5 gallons of fuel per mile (2); we cruised at roughly 250 knots and were up for 45 minutes. The delta between maximum takeoff weight and maximum landing weight is 240,000 pounds; at 6.8 pounds per gallon of jet fuel, that means roughly 35,294 gallons we had to dump. Jet fuel runs about $3.30 US per gallon as of today (3).

Then there's flight crew time (time starts when the door closes): for 8 crew members and 2 pilots, that probably ran $800, maybe $1,000 fully loaded. I'm not going to include the passenger opportunity cost (e.g., I could've done something else for the hour or so this ate up), and they're going to stick me on another flight that I do not also have to pay for, so they don't get "credit" for the income of the ticket against the first flight. The rest of this we'll assume is a dead-weight loss.

  • Cost of the meal vouchers for passengers: 10GBP x (345-154) passengers (first-class passengers were invited to the lounge for a private dinner) = 1,910 GBP; at today's exchange rate of 1.55 USD to the GBP, that's $2,960.50.
  • Cost of fuel burnt: 45 flight minutes (3/4 of an hour) at a blended speed of 250 knots (actually a little less — call it 225) covers roughly 169 miles, which is roughly 845 gallons of fuel burnt; at $3.30/gal, that's $2,785 in lost fuel.
  • Cost of fuel expelled: assuming they planned on their burn, they still needed to dump 35,294 - 845 gallons, which is 34,449 gallons (roughly); at the current price, that's about $113,700.

Total cost: about $120,450 (very roughly), crew time included. This sounds huge to an individual (it is), but in terms of overall expense I'd think it a rounding error against the full bank of British Airways flights leaving Heathrow.
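
For anyone who wants to check my work, here's the back-of-the-envelope arithmetic in one place (all figures as assumed above):

    \begin{aligned}
    \text{fuel to shed} &= 240{,}000\ \text{lb} \div 6.8\ \text{lb/gal} \approx 35{,}294\ \text{gal} \\
    \text{fuel burnt}   &= 0.75\ \text{h} \times 225\ \text{kt} \times 5\ \text{gal/mi} \approx 845\ \text{gal} \\
    \text{dump cost}    &= (35{,}294 - 845)\ \text{gal} \times \$3.30/\text{gal} \approx \$113{,}700 \\
    \text{total}        &\approx \$2{,}960.50 + \$2{,}785 + \$113{,}700 + \$1{,}000\ (\text{crew}) \approx \$120{,}450
    \end{aligned}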

There are other things here that should be flagged but are hard to quantify: costs incurred by passengers beyond their 10GBP purchase (which would be a plus to Heathrow but not British Airways), and the aforementioned opportunity costs. There's also the plus/minus on the experience in terms of word-of-mouth — interestingly, most people were jovial getting off the plane. The general feeling was one of "hey, we're alive, and they let us know what was going on". It's interesting, by the way, to watch people purchase items they didn't really want just to take full advantage of their free 10 quid. They'd come to the register having picked out their beer and sandwich, ask for change, realize they won't get it, and then ask what they could get for the remaining 1.5GBP or what have you. The Apostrophe here in Heathrow is doing a fair trade in bananas and nuts.

A Did Not Equal B. I Don’t Know Y, Either

Someone very dear to me told me about a year ago that I kept succeeding and succeeding at things, and one day, I was going to fail at something, and it would be interesting to see how I took it. Sad to say, that time has come. I bombed my Calculus test. (Please do not read a Perfectionist’s “I got less than an A” into this). The fact of the matter is I went IN to the midterm with a 98.5% cumulative grade in my homework assignments and discussion groups. (Yes, you can have discussion groups in Calculus. Yes, they’re about as stimulating as you may think.)

I left the midterm with a 74% in the class.

You don’t have to have taken Calculus, or anything other than some very basic Algebra, to know that I bombed the midterm. Here’s the rub: math is cumulative. So how could I get all of the homework *right*, but the test so very, very wrong?

“Taking Calculus online is probably the hardest way to learn it”, my teacher had warned us. Still, I went in feeling confident, I left the test thinking I may have gotten two (2) problems incorrect, and so the grade was a shock to me.

I withdrew from the class.

The numbers are thus: I could have stayed IN the class, been a metric stressbunny, and possibly toiled enough to bring that grade up to a B — *if* I aced the next midterm, *if* I aced the final. Statistically speaking, that would mean one thing would have to give in my life — and since I can't give on motherhood and work pays the bills, school had to give. I'm still taking my other class (I still have my A in that one, thanks; the midterm isn't until this Wednesday — I'll be taking it from Rome) but, given current conditions, I can't take a class where the context is not intuitive… or at least not right now.

Many friends recommended Khan Academy, which I will likely play with as I get a little more time; but quitting and/or failing at something (it amounts to the same thing) was a huge disappointment and I didn't take it well. It got bad enough that I wondered if I was having a midlife crisis; then I realized that at 39 I am, in fact, mid-life, and things really got ugly for a couple of hours whilst I wallowed in self-pity and the belief that I wouldn't amount to anything.

It’s been about five days since my reality check and I am feeling better — a lot of peripheral stress died down and I realized that I can still take classes and still toddle on to the goal — just perhaps a bit slower, and without the ability to phone it in.

I took it as a sine.