point of order

Facebook: Am I Doing This Right?

I am about a month into this Facebook experiment and I’m finding it alternately interesting and a chore.

The interesting parts come from the content that my friends and family post; it’s a real variety, as I’m sure it is for everyone on Facebook. There are pictures of the kids’ latest games or school accomplishments, laughable moments when someone paints the family dog or puts make-up on a parent; there are work rants and engagement notices and of course the ubiquitous happy birthday notices that scroll by. There are tirades against the tyranny, protests against the patriarchy, support for soldiers and friendly philanthropy. I see windows into hobbies (miniatures, comic books, quilts, photography), windows into travel (Spain, Japan, England, Australia), windows into houses (parties, selling-of-the-house, buying-of-the-house, the ever-popular remodeling-of-the-house — oh, and the building-of-the-house).

I de-muted a lot of folks shortly after the election because, like everyone else, I was in a bubble; however, I noticed that I hadn’t been missing the political posts so much as the non-political ones. I can get my politics from the Economist and NPR, but the Economist and NPR can’t show me the progress my friend has made on her garden.

Which brings me to the chore: curation. What should *I* post on Facebook, to show that I’m engaged? Am I doing it right?

The concept of Facebook curation is not new, and it’s been studied (particularly as to its impact on mental health). The idea is that if I consider what I post — the varied audience, etc. — before I post it, I am not being my “authentic” self; I am showing only the “best” side I have and thereby setting a higher standard for others to measure themselves against.

This sounds so impressive, except that I’m pretty sure that my quest to find the very best protein powder, or inability to fire the correct muscle groups in my left butt cheek, or continual surprise at insomnia when it decides to rear its ugly head, all of which are authentic, are not my best side.  Perhaps I’m not curating correctly.

I therefore started to look through my feed to see if I could find an example of curation. I believe the point of curation is to show your very best self, so the criteria I used to identify it were these: the post itself had to be positive or show the post-er in a positive light (not neutral or negative); the post couldn’t be commentary on a news item *unless* the poster had an accompanying lengthy position statement to demonstrate knowledge of the space; and the pictures, if there were pictures, had to be flattering.

I found three genres of possible curation: My Life is Instagram Fabulous, I Have a Lot of Friends, and I am a Positive Person.

My Life is Instagram Fabulous is the person who takes great pictures. Either a set of three or four, or a polite collage, all framed properly and tastefully filtered or cropped. Some of them are actual photographers, so this makes sense, and generally speaking their content is mostly photo and a little text. Often this links to their Instagram account (I don’t yet play there, but maybe I should). These are some of my more artistic friends.

The problem with pointing a finger and saying they are curating is that 1. they are expressing themselves in the medium for which they already have an affinity (these are the folks who were running around with actual film cameras back in the day and were probably the school photographer) and 2. I know them and have seen how/when the pictures get produced; yes, there is forethought and planning, but it’s mostly to capture the *feeling* of the moment and not to convey something artificial.

I Have a Lot of Friends is the person who seems to be permanently at parties and gatherings.  As an extroverted introvert this exhausts me but I can see they are having fun.  Usually there are large group photos, group selfies and photos of tasty-looking food and/or the theme of said party. I have more than a few green pictures right now thanks to St. Paddy’s. The pics seem to be taken early in the party (everyone fresh!) and midway (everyone having fun!) but not towards the end, which we all know is when your mascara is running a bit and your lipstick has worn off and everyone is exclaiming that they don’t usually yawn at 9:30/11/1am but they got up early that morning. (A note about the photos:  one of my friends is a beauty queen — honest to goodness, complete with the sash — and never, ever takes a bad photo. Ever.)

The problem with pointing a finger and saying they are curating is that 1. when was the last time you went to a party and took pics at the end? You didn’t. You were having too good a time, or you rationalized that you had already taken all of the pics and there wasn’t a point in taking more. And 2. the whole point of Facebook is to network among friends, so naturally events that tie two or more people together within the platform would be appropriate to post.

I am a Positive Person is the person who posts a lot of life-affirming, positive statements. They can be either the someecard style or the motivational-poster style. They tend to be posted in fits and spurts, leading me to believe that there is some aggregator of these things from which people can pick one or more at a given time and simply share them to their wall.

The problem with pointing a finger and saying they are curating is that 1. these tend to be things that everyone could benefit from (or get a laugh from), so from the “my life is more wonderful than yours” aspect — which honestly seems to be the sort of curation that is criticized — it doesn’t add up. If you want your life to seem more wonderful than mine by comparison, then you don’t share a great lifehack about gym prep or about how much importance to affix to given events (don’t sweat the small stuff). And 2. I get the impression that the Positive Person is trying to boost themselves and others, and that isn’t curation so much as it is, I think, affirmation.

As I review this list and attempt to see if I am Doing It Right, it occurs to me that I’ve fallen prey to survivorship bias. If we posit that “bad” curation (the kind mental health researchers are rightfully worried about) is the act of displaying only a competitive, positive slice of your life at the expense of other parts of your life — I’m thinking of teen girls mostly, thanks to the literature around this — I don’t have many friends (even Facebook friends) who fall prey to this. (You could argue I don’t have many friends. That may be true.) The sample set weeded itself out before I sent (or accepted) the invite. You could make the argument that you pick your friends but don’t pick your family — but my family is the one that helped create my mindset (think lots of Nova/Nature shows, learning to balance a checkbook at 10 and do my own taxes at 14, and a severe distaste for bullshit), and so they don’t tend to share this predilection.

So I think I’ll just keep posting whatever I think is appropriate to share on Facebook — with “appropriate” defined as probably not the contents of the morning’s bowel movement or things of a similarly super-private nature — and we’ll see if someone gets jealous of my insomnia or failing gluteus minimus sinister.

Vote

I usually resist posting overtly political messages — not because I do not have opinions (boy, do I have opinions), but because I can usually find someone screaming “my” message at the top of their lungs, participating in the cacophony that runs parallel to our electoral process.

I do not pretend to have voted in every election since I was 18. I have not. I *have*, however, voted in every election since 2000, when I returned to Washington State and, in my own self-assessment, became a grown-up (I had voted in every Presidential election previously, but like most younger folks I had largely ignored local elections). I vote because it’s one of the freedoms we have, an ostensible say in the selection of who is going to Speak For Us, and because there are still many in the world who do not have this freedom. I also vote because I’m a firm believer that if you don’t do what you can to improve things — in any way you can, the least expensive (in time and money) of which is to vote — then you don’t get to bitch about the outcome.

Which brings me to today, Memorial Day.

Memorial Day is the day we honor those who have fallen in service to our country. Male or female, any branch of service, across well over two centuries. Some of these folks died to preserve our nation and some of them died to (purportedly) preserve similar freedoms in other nations. It’s important to remember that whether or not you agree with the reasons they were sent “over there”, they still went, they still died, and they still deserve respect for it. You can argue at the top of your lungs that you don’t agree with some of our most recent wars — and you’d be in very excellent company — but the fact of the matter is that the responsibility for the Going To War rests on different shoulders than those who Go To War. Those who declare we are Going To War do so from a (hopefully) analytic mindset for the Greater Good. And those who Go To War are doing (hopefully) the best with what is given to them, be it direction, armor, or support.

That there is a deficit on both sides is well-documented, maddening, and disheartening. We as constituents find out we went to war for reasons that were not as stated, or that don’t make sense, or to support an economic position rather than a defensive one. We find out those we sent to war weren’t prepared, weren’t supported, weren’t properly supervised, mentored, and managed, and that horrible things happened to those we sent and those they were sent to protect. (The “fortunate” ones who get out, who make it back, are often equally unsupported — psychologically, medically, and financially.)

This Memorial Day I have the following entreaty: Vote. It’s the simplest, easiest way to honor those who have fallen and exercise your right to pick the people who, in effect, get to select who falls next, where, and for what. And not just for the Big Ticket — vote for your members of Congress, because they’re the ones who can officially Declare War, and unofficially bring things to a grinding halt, as well we know. You may feel like this election is one of “voting against” rather than “voting for”, but at the very least you are having a say.  https://www.usa.gov/register-to-vote 

Off

Greetings from my week off.  This is what it takes to get blogging time.

I have discovered that you really and truly can overcommit yourself, but more often what actually happens is that you don’t manage the commitments you have very well. When I went to take this week off — which started at 3:30pm Friday, April 1st, something heralded as an April Fool’s Day joke by those who know me — I would have said “I’m overcommitted and I need to step back”.

Three days in and I’ve already discovered part of my problem: my phone.

In order for this week off to “work”, I had to do two things: I had to arrange for Outlook (my mail service for work) to *not* automatically open when my laptop boots (done) and I had to detach my work email from my iPhone. The last time I did the latter was my wedding week in August of 2014.

I have had the most fulfilling, relaxing, yet-personally-productive, best-sleep weekend. I had no insomnia Friday, Saturday, or Sunday nights. I got a bunch of projects done around the house, and I took the time to actually, thoroughly read my Economist (instead of jumping to the bits I usually read and then, if time permits, reading the rest). The best part of this is knowing that if I had had to go to work today, it would have been okay: I actually unplugged this weekend.

Here’s how this has worked historically: I use my phone the way many of us do; I have my Evernote for shopping lists and recipes etc., and my fitbit tracker, and my weather app, my stock market ticker and texting (the tether to my offspring these days). I use it for a variety of things, the least of which appears to be as an actual phone, and the most prevalent of which is email. Being the checklist-y, anal-retentive person I am, I really do not want to see the little red notification bubble telling me I have unread mail. It bothers me. It’s less clean looking. I could turn off the notification badging for email, but that would be problematic during work hours (or on working days). So I roll over in the morning, check the phone and oh, there’s email: better answer that. I stop at the grocery store on my way home, and there’s email: better answer that. I pop open the laptop to get that recipe for dinner tonight and there’s email: better answer that. On weekends it would be get up, go to the gym, check in to the gym with my app and there’s email: better answer that. Stop by Home Depot, get those plants I need, let me cross that off my Evernote and there’s email: better answer that.

All of this email of course does not exist in a vacuum: answering email is step 1, and usually steps 2-48 involve updating some documentation, or sending another email to another person about the email you just got, or doing a PowerPoint presentation based on the email you just received or the email that is due in a couple of days, or updating the Excel spreadsheet so you can email the person with that and a link to the other thing about this particular thing, which reminds you about a third thing that you’d better send an email about.

It is a seemingly ceaseless stream of ingress and egress, with me as the human compute between the two; normally I like this, but I’ve realized just how much it has taken over my life. My first inkling was in checking my Delve numbers — my first instinct after seeing them was to be upset that my coworkers aren’t as responsive as I am, and my second was to realize I could never share these numbers with my husband else I’d get lectured.

The lesson of all of this is that I will make an effort to detach work email from my phone on weekends — or at least occasional weekends — going forward. I can commit to email — but I need to re-establish ground rules.

Listening Ears

The last month or so has been an exercise in emotional control and perseverance: there are the usual challenges (it’s the last productive month before people start to serially take off for the holidays; trying to eat healthily when people bring in baked goods is difficult; etc.) and new and unwelcome ones (a dear friend has passed on, the car decided I needed to spend some serious cash on it, a coworker is leaving, which in turn throws into sharp relief just how much I can separate work and life). As such I haven’t had time to blog or really reflect on much: I’ve spent most of the month reacting and creating contingency plans.

As November is gone and I find myself firmly in the twelfth month, I have either gotten better at dealing with these challenges or become numb to their effect. The result is that I can finally take some time to concentrate on a (relatively) new concept: being self-aware and open-minded during challenging times (especially meetings).

We’ve had some training on this recently at the ‘soft, and courtesy of a side-program I’m getting a larger tutorial in how perspective can shape an entire interaction for the better (or worse). Traditionally I am not one to assume the best of intentions when dealing with someone during conflict — it’s something most people do not default to. (I know of one person who, I think, can honestly say that during a contentious debate she keeps her “opponent” in a positive light; it’s fitting that she is the extremely patient Executive Director of a nonprofit devoted to helping schoolchildren (and teenagers alike).)

The idea of unconscious bias is not a new one. It’s the reason I assume the teenager in the brand-new Porsche in front of me is spoiled rotten (instead of thinking they may be enjoying a ride with Mom or Grandma in their car), that the guy who cut me off on the freeway is a jerk (instead of hoping that whatever emergency he’s rushing off to is quickly resolved), that the person at work who hasn’t got back to me is a slacker (instead of positing that their workload is just as heavy as mine). It’s the reason some bosses assume it is ill-advised to hire single mothers (and some deliberately hire them), why some tourists speak English increasingly loudly at the people who don’t understand them, and why most people think NPR listeners are Loony Lefty Libs. (Hi.)

Nor is the concept of self-awareness a new one, even if it is not practiced terribly often. In an era of “selfies” and Kardashians, you’d think self-awareness abounds, but alas it does not. The next time you think you are self-aware, check how long it takes you to calm down after an argument with your spouse: that is, once the issue at hand has been resolved, how long it takes your autonomic nervous system to chill out (your tone of voice changes, your heart rate slows down, you stop grimacing and feeling like you’re still arguing but aren’t really sure about what anymore).

In other words, it’s hard, when the guy has just cut you off and your latte has landed in your lap, to stop and think “gosh, I hope he gets there in time”. It’s equally hard to sit in a meeting with someone who is disparaging your product or questioning your priorities and believe they are coming from a positive (or even just productive) space. It’s a skill set to practice and a useful one at work and at home, to be sure. It’s harder still, when the media (“social” and otherwise) is screaming at you about the impending Armageddon (be it ISIL or Climate Change or Global Economies or Airbags or Guns or Presidential Candidates), to be positive about much.

The suggested approach (from training, shortly to be invoked in different ways) is to practice active listening: in other words, to let the other person say what they need to say NOT with a view to “how much longer do I have to listen to this drivel” but with an earnest attempt to understand where they are coming from, and to acknowledge that position. This, combined with assuming the best of intentions, should serve to deter the impression that the other person is wasting your time/out to get you. The other tool provided is essentially a “so what are we going to do about it?” mechanism — it’s perfectly fine to air an issue, but come ready to solve it or to commit to solving it. This should serve to ensure that conflict — when it does arise — is used in a positive and productive fashion. These things sound practical and practicable, but I suspect in the heat of the moment they aren’t that easy to call upon. I think, however, it is better to try, in these trying times.

Pink and Blue

I was at a child’s birthday party/dinner last night and we were discussing pedicures. One of the boys was running around with his nails done, and I mentioned that I had asked my son if he ever wanted to do that (for he was welcome to come with me), to which he assertively said “no”. When I retold the story I included my pointing out to him that they have “plenty of ‘boy’ colors”, to which a friend of mine (appropriately) chided me with the question: “What is a ‘boy’ color?”

She and I both knew the actual intent of the conversation was my attempt to clarify to my son that if he wanted to have his nails done a color other than pink or red, that was available these days; that my use of “boy” was to appeal to a child who had already had societally-driven color choices drilled into him; and that her attempt to tease me was the sort of thing friends do. I am sexist in some ways, but not that one. That said, it did open the wider discussion of gender color stereotypes, which is one reason I like hanging out with the people I hang out with: we didn’t have to devolve into a contrived political correctness or a stunted conversation about color choice equality.

To provide some background: one friend has a boy and a girl, another has the same, a third has two girls. I have one boy, and elect to dote on girls by proxy. But this conversation, plus a recent review of the Mindware catalog, got me thinking.

It’s no secret that the toy aisles of Target, or Toys R Us, or really any other mainstream store, are segregated into three types: “Gender Neutral” toys, “Boy” toys, and “Girl” toys. The “Girl” toys aisles are helpfully marked by pink signage and include things like dolls, baby dolls, Barbie dolls, domestic dress-up games, and (possibly) Pink Legos. The “Boy” toys aisles are manfully black and blue and red and include Nerf guns, trucks, traditional Legos, new movie-themed Legos, and Legos that look fairly militaristic. In the middle you have card games, balls, and Slip-and-Slides.

Why do my friends’ daughters’ Legos need to be pink? The sets default to building houses and ice cream shops, fine: but you can build a perfectly serviceable house out of red and blue and yellow Legos, because we all did that when we were growing up. And if they are red or yellow or blue you don’t have to build a house or an ice cream shop; you can build a rocket. Or a scale model of the Revolutionary War.

Why does her son’s Medieval Lego kit have only armed soldiers? Anyone who spent time in History class knows that those soldiers were outnumbered 10:1 by peasantry, from those who fetched the grain that fed them to those who took care of the horses. There’s no queen, no princess, no baker-lady (for gender roles *were* explicit in those times, and so Lego would be perfectly correct to have them in a set representing that part of history). But no females of any kind can be found in the kit. This very manful Lego kit is full of manly manliness and action. Girls need not apply.

The new Star Wars Lego minifigures this year at Target have two (2) females out of what looks to be about 50 new minifigures. They are Leia in the gold bikini and Padme in the ripped-midriff shirt. Not Leia in her Cloud City costume, or Padme in her full regalia (Lego has found a way to make capes; they could’ve rocked this). (If you go to lego.wikia.com you can see the full minifigure line-up, and there are plenty of other female Star Wars minifigures. But they’re not on display at the Target.)

Why must Goldieblox, that bastion of gender correction in educational toys, be pink for girls? The whole point of Goldieblox was to get girls interested in engineering, to rail against the pink conformist aisles of toys for girls. Why, then, is it pink? And why are the cookware “play sets” for children almost always manned by a girl? My son loves to cook. But the photos on the box are clearly stating that this is a largely female domain.

While there are darned few pink toys in Mindware, there are far more boys in the pages of toys that are about engineering and far more girls shown next to the beading kits. If we can’t get our arched-brow, intelligent-snowflake-producing toy catalogs to play fair, how do we ever hope to rescue our girls from the clutches of Barbie?

From the ’20s to WWII, the color “ownership” between boys and girls was inverted: boys wore pink and girls wore blue. Before that, children wore white (which, if you think about it, is practical as long as you have a ton of bleach lying about). After WWII this reversed, and we have the color “preferences” we see today. With all the challenges our kids have and will have (underfunded schools, bullying made easier via the internet — and this is not just kids, as adults have been doing that for years), can we please not heap color “correctness” upon them as well?

When you’re first pregnant, the question you are asked is “do you know yet if it’s a boy or a girl?” Some parents elect to make it a surprise, and there’s that moment of “discomfort” for the families because now they “have” to shop for yellow or green baby gear. I honestly don’t think the baby cares whether s/he is housed in blue or pink; the larger purpose is social signaling, announcing this tiny creature’s gender to his/her audience. I’m considering making onesies for babies that say things like “Manly enough to wear pink” and “Girly enough to wear blue” just to poke fun at this.

We live in a country of astounding wealth and opportunity and we have larger problems than the color or gender appropriateness of toys. The only way to shape and change the offerings is to vote with your dollars; it’s effective but it takes more patience than some of us have.

Ramp

As I have just recently changed jobs — which entailed leaving one job (and all of the transitional madness associated with that), having a purported week off (more about that later), and then starting the new job (I’m almost two weeks in) — I’ve been a bit busy.

A prudent me would have curbed social engagements and extracurricular activities, and given myself some slack at the gym. But prudence is not one of the words that comes to mind when I think of me (although someone called me quirky the other day, so now my quirky meter has gone up a bit and I need to see what *that* is all about), and I didn’t. One of the organizations I help with staged an intervention and dropped me from 2 of the 4 committees I was on, not because they doubted my abilities, but because they feared for my sanity.

I spent my week off with a healthy checklist and a desire to make my son’s last Elementary School science fair something to behold. I think I marginally succeeded, given that the rules were simple: no fire, no liquids, no electrics. Naturally, we had all three, including one experiment demonstration involving fire (that I had to forcibly shut down), one exhibit requiring not one but two extension cords, and a couple of suspicious watery areas on the floor of the gym (where the student exhibits were). By the end of my week off, I was ready for a week off.

And then I started working at the new job.

Starting a new job is exciting and sucks at the same time: exciting because everything is New And Different And Thrilling And Did I Mention New, and sucks because Guess What, I Don’t Know Everything — Or Possibly Anything — Anymore. It’s that awkward phase of not knowing a company, or any of your coworkers, or (in my case) your platform. My days are spent in Outlook, PowerPoint, and meetings; my evenings are spent learning a new language (when I so choose). There is more work to be done than can ever be done, and so the challenge of work-life balance rests solely in my court.

This sort of disciplinary requirement, plus the uncomfortable position of Not Knowing Everything, makes for an unsettling period. Throw in the end of the school year (6 weeks to go!), a couple of trips, a pending wedding (erm… mine), and a neglected garden, and you’ve got the recipe for an OCD breakdown. I may have required myself to wash the sink twice and empty out the dishwasher completely before I allowed myself to eat dinner tonight at 8:30 when I got home.

Which is all a very long and rambly way to say: I’m a bit swamped at the moment, and sorry I haven’t written. Fresh content is on the way, it’s just stewing in the back of my brain. Specific blurbs will include: The Sadness That is Washington State Education and Funding, Hot Yoga (the Opinion of a Reluctant Convert), and Woodinville Wine Country: You Aren’t as Witty as You Think You Are.

Stay tuned…

In Development

I was at a holiday gathering the other day and during the usual course of “…And what do you do?” I replied that I was a developer. The inference was that I was a Real Estate Developer; I had to explain that I was a Make the Computer Do Useful Things Developer. I was talking to two ladies about my age (Hi, I’m 40), and was surprised at the reply: “Oh, that’s unusual!”

I suppose I should not have been. I know a lot of women in IT, but darned few who do development. To be clear: most of the women I know in the Information Technology space were at one point developers, or have a passing knowledge of some development language. They moved into Project or Product Management, or Business Analyst roles. These roles require knowing what is possible with code without actually having to write any of it, and so if you get tired of the incessant progress of development technology, that is one way up and out (and it is the way I took, about five years ago).

Careers arc and opportunities knock and itches flare up and I am once again a developer.  And I find myself, when talking to people who don’t work with or know other developers, battling not only the usual misconceptions about development, but the gender-based ones as well.

Development (in IT terms) is the handle one applies to the concept of using a series of commands (code) to tell the box (tower, laptop, server, etc.) what you want it to do: whether you want it to take something in or not, whether you want it to spit something out or not. In order to create this blog post, many people did varying forms of development (from creating the templates that instruct the browser how to make this post look all shiny, to the protocols that tell the server where to put this post, to the widgets on the front end that tell you things like I haven’t posted in a while). If I typed it in MS Word, that required a bunch of other development by a bunch of other people.
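For the non-developers, here is roughly the smallest possible “take something in, spit something out” program, sketched in C# (one of the .Net languages I mention below). It’s a made-up toy for illustration, not from any real project:

```csharp
// A minimal sketch: tell the box to take something in and spit something out.
using System;

class HelloBox
{
    static void Main()
    {
        Console.Write("Who are you? ");              // take something in
        string name = Console.ReadLine() ?? "stranger";
        Console.WriteLine("Hello, " + name + "!");   // spit something out
    }
}
```

Everything else development does is, at some level, an elaboration of that loop: take something in (or not), spit something out (or not).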

Development is not:

  1. Something you can do on five screens drinking 3 bottles of wine to create a “worm” that appears as a graphic on your screen (as in Swordfish), and it usually doesn’t involve a developer leaving an Easter Egg of themselves in a bad Elvis costume with sound effects (as in Jurassic Park)*. If I drank 3 bottles of wine and was looking at 5 screens, they’d probably be the ones you see in a hospital room, and the only graphic I would see appearing would be the “worm” that is my heart rate monitor’s flat-line. And while I have myself buried Easter Eggs and commentary in code, it isn’t that elaborate, because you don’t typically have time to build elaborate things. You’re busy rewriting all of the stuff you just wrote because someone decided to change the scope of your work.
  2. Anything involving a graphical user interface (GUI). When a developer talks about manipulating objects, they are typed-out phrases; they are not boxes that are dragged and dropped. There are some development environments that offer up a GUI in tandem with the “scripting” — that bit about writing out words I was talking about — but more often than not the GUI is there to illustrate what you have scripted, not to assist in your scripting.
  3. Finite. Development technology is constantly changing, and no one developer knows all of the development methods or languages. That would be like someone knowing all of the spoken languages in the world. Rather, it’s typical to find one developer who “speaks” one development language really well, or maybe a branch of languages (much as you run into a person who can speak Spanish and French and Italian because they are rooted in the same Latin “base”, it’s not uncommon to find someone who can code in ASP.Net and VB.Net and C#.Net, because they’re all of the Microsoftian .Net base). No one hires “a developer”; they hire a .Net Developer or a Java Developer or a Ruby Developer or what have you. Specialization exists because the base is so broad.

Modern cinema has done an injustice to developers in terms of making what we do seem both simple and sexy; the “shiny” environments typified by the interfaces “hackers” use on-screen look really slick and probably took real developer hours to make look good… with absolutely no real purpose. That said, actual development can be simple (with clear requirements and a decent knowledge of the things you can and can’t do) and can be quite sexy (if you’re sapiosexual). It’s just not well-translated in current media. (To wit: Jeff Goldblum uploaded a virus to an alien system on a MacBook. He didn’t have to know the alien system’s base language, machinery, indexes, program constraints, functions, etc. And it was on a Mac, in the ’90s, when development was not one of its strengths.)

Most of what development is, is trying to solve a problem (or two), and generating endless logic loops and frustrations along the way. You build a “thing”, you think it works, you go to compile it or make it run, it fails, you go dig through what you wrote, find you’re missing a “;” or a “,” or an “END” or a “GO” or a “}”, re-run, find it fails again, and go dig through some more. For every hour you spend writing out what you want it to do, you spend about an hour figuring out why it won’t do it. This process of “expected failure” is not sexy or shiny or ideal, and that’s why it doesn’t show up on-screen.
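To show what that loop feels like, here is a hypothetical C# fragment with the classic failure point flagged in a comment; the names and numbers are invented for illustration:

```csharp
// A toy example of the compile-fail-dig-fix loop.
using System;

class ScoreTally
{
    static void Main()
    {
        int[] scores = { 12, 7, 31 };  // hypothetical sample data
        int total = 0;

        foreach (int score in scores)
        {
            total += score;  // delete this ';' and the build halts with
                             // "error CS1002: ; expected" -- cue the hour
                             // of digging for one missing character
        }

        Console.WriteLine("Total: " + total);
    }
}
```

One missing character is all it takes to stop the build cold, which is how the hour of writing earns its matching hour of figuring out.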

These are misconceptions every developer, regardless of gender, has had to deal with at some point. Some deign to explain, some gloss over, some simply ignore; much as I really hope we get a socially-functioning, intelligent person on-screen soon, so do I hope that we get a showcase for the simple elegance of real development.

It would be great, too, if there were more female developers on “display” (and not for their bodies, hence the scare quotes). Think through every movie you’ve ever seen that shows people doing any real development, “hacking” even (a term that is abused beyond recognition); how many were female? Go back to the movie “Hackers” — did Angelina Jolie actually, ever, really type anything? You inferred that she did, but the real development, the real “hacking”, was done by the crew-of-guys. Oh, and that’s right, she was the only girl. The Matrix? Carrie-Anne Moss spent precious little time in front of a computer there. She did look damn good in skin-tight leather.

Fast-forward a decade (or two) and we’re pretty much in the same boat. You see women behind computers on-screen, but they are typing in word processing programs or moving the mouse to click on the shiny picture of the Murderer/Prospective Boyfriend (or, you know, both). They aren’t buried under a desk trying to trace a network cable or eyeballing multicolored text trying to figure out *WHY* it won’t compile; they’re delivering the shiny printout to the Chief/Doctor/Editor from which Decisions Will Be Made.

We find it surprising in social circles, I suppose, for women to be in development because we don’t see it exemplified or displayed in any of our mediums. TV, movies, even proto-development toys for children often feature eager-looking boys interacting with them; the girls are reserved for the beading kits and temporary tattoo sets. (Actually, there’s precious little out there for getting your child, regardless of gender, to learn code, but that is changing.) We have crime-solving anthropologists, we have NCIS ass-kickers, we have cops and coroners; maybe it’s time we had a developer.

*Jurassic Park is a good example of both great and poor development display. Right before tripping that “Dennis Nedry Elvis Graphic”, Samuel L. Jackson’s character is eyeballing Nedry’s code. That stuff that looks like sentences that don’t make sense? That’s code. That’s what it looks like, for the most part. Unfortunately, later on when the little girl is hacking the “Unix System” that “she knows”, it’s all graphical. And that’s not accurate.

Cliché

Clichés, as a rule, bother me. This has to do with my innate dislike for anything that must “be accepted”. The absolute BEST way to get me to not read a book, not see a movie, not do something, is to tell me I MUST read XYZ book, I MUST see XYZ movie, I MUST do whatever. It just won’t happen. If I’m in “polite” mode I will dither, if you are family I *may* humor you, but otherwise it’s just not going to happen. This explains why I still haven’t seen “The Breakfast Club”, why it took some serious cajoling to read Lean In (yes, yes, blog post coming about it eventually), and why, at 40, I don’t know if I own a hairdryer because I simply refuse to use one.

Clichés are the verbal “you must”. They suggest that there is something out there you must do, or must allow, because it just *is*. The absolute worst one, in my opinion, is “Everything Happens For A Reason”.

Please. Just… don’t.

Things happen because they happen. There is little reason in someone going into a school and shooting children, there is little reason in the antics of Congress (these days), there is little reason in Wall Street (as evidenced by a Dow nearing 16k whilst we have the hurdles we have). There need not be, and frequently there is not, a reason.

Saying “Everything happens for a reason” is a way of accepting a lack of control; it means “I can’t see a good reason for this to happen in a logical world, so I will abuse this platitude and hope it changes the subject and/or gets the person who is trembling with doubt, pain, or hurt to stop long enough for me to be comfortable”. Looking for “reason” where no good one exists is insanity. Or optimism.

I’m more of a fan of “It is what it is.” “Que sera, sera”, as sung by Doris Day, is accurate. Things happen: this much is true. Entropy increases. Time marches. But the notion that there is some underlying reason causing a typhoon to kill off five thousand people, or a tsunami and earthquake to hit the site of a nuclear reactor, is asinine.

OK: point and counterpoint. Correlation and causality. That is to say, YES, ultimately there is a cause to every effect. A ginormous typhoon hit the Philippines because global warming has warmed the atmosphere and waterways in that area to devastating effect, and the bomb that was going to go off there went off with a bigger bang; people tend to build nuclear reactors near waterways in order to easily flood the site to cool it down. But when people say “Things happen for a reason” they do NOT mean “things happen because a series of events led to them”; they mean “there is some good reason for this to have happened”, and “good reason” usually implies that somehow, somewhere, there is benefit.

You will notice that very rarely does anyone say “Things happen for a reason” when something happens that is obviously beneficial. “Things happen for a reason” is not applied to the lottery win, or the quick reflexes that get you OUT of a car accident, or the “A” you got on your Chemistry final. No, then you take the credit: you studied, you had quick reflexes, YOU picked the lucky number. So if it’s good, you controlled it with your abilities and your skills; if it’s bad, there must be some better reason for you to have fallen on misfortune.

I have been in plenty of good circumstance that was of my own doing, and nearly as much misfortune that was as well. I do not attribute this to “fortune”; I attribute this to the way things are. It is what it is.

But it may not have happened for a “reason”.

Yes, It Was The Right Choice

Five months ago I accepted a new job with Sur La Table. I had spent nine years at Expedia doing a variety of things, and learning a tremendous lot, but it was definitely time to move on and be the “fresh blood” somewhere else. As I gleefully told my family, friends, and professional associates of my move, I got mainly 3 reactions:

1. That’s great… what do they do again?

2. That’s great… wait, you’re moving from Director to Manager?

3. That’s great… are you making more money?

I can sort of see the first reaction if you’re talking to someone who’s not in one of the 27 states that SLT operates in, and/or who doesn’t cook. (I am not judging. Yours truly has a few friends who know an awful lot about food but whom you shan’t let in the kitchen.) The other two have been reiterated so often that I figured I’d just answer them here, and then point people to it.

1. Sur La Table (www.surlatable.com) is a store, and site, for cook’s tools and entertaining. That’s it. You are not going to find beekeeping outfits, a large selection of scented candles, ironing boards, etc. You are going to find a wide selection of knives and people who can tell you how to use and care for them, because they know. You are going to find a variety of stove top cookware, in a variety of materials and colors, and any one of the people wearing a Sur La Table apron can tell you, depending on YOUR cooking style and YOUR stove, what will work for YOU. In more than half of the locations you will find a roster of classes you can take that will teach you everything from how to use your knife properly to how to make homemade pasta to how to do five recipes on one grill for six people.

2. Yes, I moved from a Director to a Manager. Specifically, the course was Director of Business Development to Director of Content to Applications Development Manager. And here’s your first clue why “different” does not mean “downward”: I went from what was essentially inflated project management (with a bit of ability to direct the change that instantiated the project) to Operations management to development management. With each step the skill set gets broader, and deeper. Project management is about managing people you don’t technically manage; Operations management is about managing people you manage and managing by proxy. Development management is all of the above, and now you get to speak two languages: business and technology.

I could go on: development offers a chance to actually BUILD THINGS, and a Director at Expedia is not equivalent to a Director at Microsoft, which is not equivalent to a Director at Sur La Table, in either breadth of responsibility or in terms of compensation. And frankly, I’m mercenary enough to be happily titled the Hobgoblin of Object Oriented Programming if they pay me enough, which leads us to…

3. Yes. I mean, I can offer the logic that benefit packages from Company A to Company B require careful weighing and measuring, and that there are quality of life trade-offs with commuting time, etc.  But any way you slice it, frankly, the answer is yes. Anyone who tells you that “Retail” is this or “Technology” is that is at best over-generalizing and at worst missing opportunities.

None of this answers the question, four (working) months later, of “Are you enjoying it?”, and the answer is an unqualified YES. Do not get me wrong, there have been seriously frustrating times. Sur La Table has been around since the ’70s, but its growth pattern is such that it *feels* like a start-up, with all that that entails. Development has to run quickly and there is enormous demand for my department, which leads to the wonderful sensation that “we can DO this” combined with “OMG how are we gonna do this??” There’s a bit of “hey, let’s go down this path… no wait, that path… no, let’s go down the first path” that you see in nascent organizations, and for someone who was at a company that went from start-up (well, close to it; it was about 4 years in) to Mature Large Company during my tenure, there’s the urge to be much farther along the development path than we are.

Then again, it affords me (us, really) the opportunity to be there to make the changes that need to be made, and build the cool, fun stuff that needs to be built. That, by far, is the best reason.

News at 140 Characters per Second

A couple of days ago, I was eyeballing my Twitter feed and it “exploded” — tweets came at a furious pace, retweeting, modified tweeting, quoted tweeting, fresh tweeting. Tweets with links, tweets with emoticons, serious tweets and facetious tweets. All of them (barring Sponsored Tweets, which are something I’d pay to NOT have to see) were about the Fed’s Q&A session.

I didn’t have to watch it (I caught clips later). I had, quite literally, a play-by-play review from journalists, editors, friends, co-workers, and friends-of-friends of every question, position, response, and impact. “Knowing”, as I do, most of these sources, I could tell who was being predictably circumspect, who was flying off the handle, and who was simply “reporting”. I had a dozen neatly arranged bits of data at my fingertips.

This is the same Twitter feed that gave me an equally determined and detailed vision of “Sharknado”, the deliberately cheesy Syfy flick. (It was what it sounded like: Sharks. In a Tornado.) Quite possibly the best thing I read about it was that the special effects were akin to dropping 3 bowling balls in a bucket filled with a 50/50 mix of “Motor Oil and Kool-Aid” (that, from NPR).

I’ve heard Twitter criticized as the medium of the vapid, a haven for narcissists, a cocktail party happening in 140-character snippets. These are, actually, all accurate impressions. Twitter is chock-a-block FULL of vapid narcissists (um, hi!) and is very much like a cocktail party. The trick with a cocktail party, though, aside from eating a bit beforehand and judiciously measuring your alcohol intake, is to 1. not stick yourself with a group of people who don’t tend to agree with you, unless you’re that rare creature who can handle an honest debate, and 2. find the group of people with the discussion base that interests you. If that happens to be the Kardashians, well, enjoy. I won’t be with you, though.

To some extent Twitter is a very personalized “news” feed, and I say that with “air quotes” (aka “Bunny Rabbit Ears”) because “news” as a concept is bastardized near and far. Even Al-Jazeera Egypt is now subject to scrutiny of its authenticity, I’ve heard Fox News called “Faux News”, and even CNN has had criticism. I personally float to the Economist and the Guardian, because if you’re going to get brutally fair journalism you’re going to get it from a race that self-flagellates as a cultural point of pride. It’s further personalized by the fact that you’re unlikely to “follow” anyone who irritates or annoys you, much as you’re not likely to grab your wine/vodka tonic/beer/margarita/iced tea and stand next to that asshole you wished the hostess wouldn’t invite to her party. You can safely intake your news with whatever bias you prefer, and get it that way.

An interesting thing happens, though, in the Twitterverse: the concept of the “retweet”. You may not stand next to the asshole at the party, but his voice can carry. You can attempt to tune it out, but someone may (conspiratorially, mischievously, inaptly) repeat exactly what he said in a “You wouldn’t believe what [the asshole] just said” sort of way. Ladies and Gentlemen, enter the retweet. Retweeting is not limited to “hey, look, this person thinks like I do” but can also be an entrée to “Holy shit, can you believe this douchebag just said that?”. In a world where you are not tolerant enough of the douchebag to follow him/her, chances are someone in your Twittersphere is, and will let you know what s/he said. Twitter is therefore no more, or less, useful than any other medium of news delivery we have had to date. It’s just delivered in an abbreviated fashion.

That may be a blessing.