On Driving An Electric Car

I had a new experience: I drove an all-electric car. I come from a yuppified, Prius-driving household, but I’d never actually driven a purely electric car, and I thought I’d share a couple of notes.

Electric Car2go

The first thing I noticed was that I couldn’t initially tell that the car was on. Normally when I’m in a Smart Car, I turn the key, the engine roars to life (okay, it’s more like the sputter of a lawn mower…), and that’s my cue to release the key so I don’t destroy the starter motor.

There’s no such sound with an electric motor. If you’re me, you just sit there turning the key, slowly coming to the realization that something has happened, and then you notice the dial:

Inside electric car2go

That’s the only signal you’ll get that the car is actually on.

When you start moving, it’s just the sound of the tires on the road and the occasional high-pitched whine of the motor when you press the accelerator (which really should be called something else, because there is barely any pickup).

I did notice one really weird thing while driving: the car is by default in neutral if the accelerator isn’t depressed.

I’m one of those people who releases the brake just before the light changes so that the car is already in motion and can accelerate smoothly. However, when I released the brake on this car, I found myself slowly rolling backwards. Rather than being in first gear, the car was in neutral; this meant I had to keep the accelerator pressed constantly to keep it moving.

Neat experience and a lot of fun; hopefully I’ll get a chance to try another one soon.

Cameratastic

One of the joys of owning a smartphone is installing camera apps on it so that you can take crummy photos and then make them look like a Monet or a Chuck Close without having to boot up Photoshop.

I’ve been playing with a few apps recently (beyond the granddaddy of ’em all, Instagram), so I thought I’d share them with you.

Here’s my original image, a middling photo of some crocuses:

Original - Crocuses

Here’s what it looks like after being passed through Painteresque:

Painteresque - Crocuses

Want more of an etched look instead of a painted look? Try the aptly named Etchings:

Etchings - Crocuses

Or perhaps you want something a little more abstract. Try Percolator:

Percolator - Crocuses

If you percolate it further, you’ll get this:

Percolator - Crocuses

For a more playful look, try Halftone:

Halftone - Crocuses

A few random thoughts:

  • Will we see these filters become a core part of the built-in camera? Will iOS or Android ever cannibalize their ecosystems by pulling these into the main app?
  • Everyone loves apps that make them look like a professional, and these apps do a fantastic job of it. I wonder what other app categories could do the same?

Drones as a Service

Earlier today my tech partner and I were grabbing coffee and shooting the shit about all the cool physical devices that are appearing these days. Spheros and Myos. FitBits and Jumps and FuelBands. Arduinos and Raspberry Pi’s. Kinects and Pebbles and quadcopters.

It will be fascinating to see how this all plays out. Some are platform plays and will require an ecosystem; others are vertically oriented. Who knows which will create the most value; if it were obvious, the people who love these things wouldn’t be called “early adopters.”

Anyway, we were trying to imagine what the world will look like when some of these technologies go mainstream – and that led us to quadcopters.

They’re currently used mainly to bounce balls in Switzerland and chase kangaroos in Australia, but it’s worth considering what the world looks like when they’re mainstream items.

You can imagine lots of people wanting access to drones as a way to provide a set of remote eyes, and not just the cops.

Want your roof fixed? The roofer sends the drone to look and then gives you an accurate quote.

Issue with nearby cell tower or electric utility? Send the drone over before sending a human so that we can make sure the truck’s got all the right gear.

Neighborhood watch sees something funny? You guessed it: fire up the drone.

City needs some real time data on the ‘hood? Bingo. Drone time.

Now, the key thing here is that most of these organizations don’t need a full-time quadcopter, and they’d rather not have to worry about maintaining it.

And this is where the idea of Drones-As-A-Service (DAAS) is born.

Basically, someone sets up a network of drones (we already have spaces for them: they’re currently called cellphone towers) and creates an API or web form. If you need the drone, you just fill out the form stating what you need it to do for you (“fly to roof at 1515 West 2nd Ave and execute site sweep. Pause for close-ups.”). You can even imagine an app that notifies you when the drone is available and lets you communicate with the pilot (alas, she’ll still have to manually fire the Hellfire missile at your neighbour).
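To make the thought experiment a bit more concrete, here’s a minimal sketch of what the request body behind that form might look like. Every field name and the `DroneJob` type are invented for illustration; no such service or API exists.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical job request for a drone-dispatch service. All field names
# here are made up for the sake of the thought experiment.
@dataclass
class DroneJob:
    address: str                  # where the drone should fly
    task: str                     # e.g. "site sweep", "roof inspection"
    notes: str = ""               # free-form instructions for the pilot
    notify_on_ready: bool = True  # ping the requester when a drone frees up

def to_request_body(job: DroneJob) -> str:
    """Serialize the job the way a web form or API client might submit it."""
    return json.dumps(asdict(job))

job = DroneJob(
    address="1515 West 2nd Ave",
    task="site sweep",
    notes="Pause for close-ups of the roof.",
)
print(to_request_body(job))
```

The interesting design question is how much structure to impose: a free-text `notes` field keeps the human pilot in the loop, while structured `task` values would let the service route jobs automatically.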

I’ve thought about this idea for about as long as it took to write this blog post, so maybe it’s the stupidest idea ever, but thought it was entertaining enough to share with y’all.

Algorithms Without Feeling

I was playing around with the Google Maps API’s Autocomplete today. This is the feature that lets you type something like “Vancouver” and turn it into the corresponding city. As you type each letter, it guesses what place you’re looking for.

Google, being Google, has put a remarkable amount of intelligence into making this work.

For example, you can specify whether you want all types of places or just businesses or just city-like entities, etc.
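For the curious, here’s a rough sketch of how that restriction looks when querying the Places Autocomplete web service directly. The endpoint and the `types` values (`establishment` for businesses, `(cities)` for city-like entities) come from Google’s documentation; `YOUR_API_KEY` is a placeholder you’d replace with a real key.

```python
from urllib.parse import urlencode

# Places Autocomplete web service endpoint (per Google's documentation).
AUTOCOMPLETE_URL = "https://maps.googleapis.com/maps/api/place/autocomplete/json"

def autocomplete_url(query: str, types: str = "establishment") -> str:
    """Build the request URL; the `types` parameter narrows the guesses."""
    params = {"input": query, "types": types, "key": "YOUR_API_KEY"}
    return AUTOCOMPLETE_URL + "?" + urlencode(params)

# Restrict the guesses to businesses:
print(autocomplete_url("s"))

# Or to city-like entities:
print(autocomplete_url("Vanc", types="(cities)"))
```

The JavaScript widget exposes the same restriction through its options object; either way, each keystroke triggers a fresh set of guesses.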

The set of guesses is also dynamic. Google is famous for using data from one arm of the business to train other parts of the business. For instance, while taking Street View photos they’re also training driverless cars. And it looks like they use the frequency with which people search for location names to best determine which places you might be looking for.

How do I know this? Well, here’s the autocomplete result from this afternoon when I restricted it to just businesses.

Type in the letter “s” and, awkwardly, the first result is Sandy Hook Elementary School – the site of last year’s horrific school shooting.

Autocomplete Example

Google’s unfeeling algorithm has no idea that Sandy Hook is infamous and almost certainly not the place people are looking for, so it’s going to keep serving it up.

Here’s a Google Trend report on the term “Sandy Hook Elementary School”. You can see the spike and decay in “interest”. Apparently it hasn’t decayed enough to have been removed from the autocomplete:

Google Trends


There’s nothing nefarious about Google’s algorithm (and dealing with Black Swan events that skew your search data is tough), but it makes us aware of the odd new world we live in.

Unemotional algorithms dictate what is topical; this shows up in unexpected places, and we can literally watch and measure the rate at which moments slip into our collective memory.

Throwing Light

For Christmas I bought Wen an “Etch” by Tom Dixon. It’s a tetradecahedron (14 sides; mixture of hexagons and squares) where 13 of the faces have hundreds of holes cut in them and one face is open. You’re supposed to leave the open face up and drop a candle in it.

However, our apartment is full of candles (they were my mom’s gift of choice for years; we have a great selection) so I’ve basically sworn that I’m not allowed to bring any into the house. Plus, I thought it would be more interesting if the Etch was flipped upside down and instead powered by a brighter-than-a-candle-and-more-colourful-too BlinkM (an LED that can be programmed to cycle through colours).
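A BlinkM is actually driven over I2C (it has its own little sequencer for storing light scripts), but the colour cycle itself is just math: sweep the hue around the colour wheel and convert each step to RGB. Here’s a sketch of how one might compute such a sequence; it isn’t BlinkM driver code, just the colour math.

```python
import colorsys

def colour_cycle(steps: int, brightness: float = 1.0):
    """Yield (r, g, b) byte triples evenly spaced around the hue wheel.

    A sequence like this is what you'd upload to (or stream at) a
    colour-cycling LED such as the BlinkM.
    """
    for i in range(steps):
        # Hue runs 0..1 around the wheel; full saturation for vivid colours.
        r, g, b = colorsys.hsv_to_rgb(i / steps, 1.0, brightness)
        yield (round(r * 255), round(g * 255), round(b * 255))

for rgb in colour_cycle(6):
    print(rgb)
```

With six steps you get the six fully saturated primaries and secondaries (red, yellow, green, cyan, blue, magenta); more steps give a smoother fade.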

Here’s the result:

I built a little box to house the BlinkM and its battery pack; the Etch sits on top of it. The video doesn’t do it justice (I don’t really know how to shoot low-light video) and definitely doesn’t capture the range of light and shadow, or the nifty projections it makes on the ceiling.

Here are a couple of photos to give you a better idea of what it looks like when lit up: