Archive for the ‘Startups’ Category

Look to the Heavens: a Parable

“Computer science is no more about computers than astronomy is about telescopes.”
— Edsger Dijkstra (possibly misattributed)

When I was young, I enjoyed looking at the stars. I remember tilting my head upward and seeing the universe unfold before me, untold wonders, billions of brilliant points of light. I would lie for hours on my back, letting the stars crawl across the sky, lost in the beauty of the world beyond me.

Later I would learn astrophysics, and study the fusion reactions in our sun, the main sequence, and the life cycle of the stars. I was fascinated by how the first stars had fused hydrogen into heavier elements and then exploded, propelling carbon and oxygen across the vast expanse to become part of planet Earth. They left majestic dust clouds in their wake, reflecting the light of a thousand thousand solar furnaces. This light travelled across space and time, quantities that I could measure and calculate. My study of the stars served only to augment my wonder.

Slowly, at first, others began to take interest in my passion. They were enthralled by telescopes, which let them see the stars with increasing clarity and precision. My field was crowded with ever more telescopes and their advocates. Soon, an arms race began. Where once we had to manually position telescopes with sights and paper charts, new models were motorized and at the touch of a button would move themselves to view any star desired. Now even these are passé. The reigning champion is a device with a screen that, as it is moved in space, displays the stars on the other side overlaid with their names and constellations. The very act of using this device, however, obscures the stars themselves.

Astronomy became ever more popular. One January, an upstart startup declared it was Astro Year, and that everybody should learn how to operate a telescope. Everyone was claiming to have the Next Big Telescope that would simplify our lives. Startups promised telescope boot camps, bypassing the time and cost of a traditional college education. “Do you know that job openings for astronomers outnumber applicants 2.3 to one?” I was asked. I shrugged. I wouldn’t turn down a job, but that wasn’t the reason I was here. All I wanted to do was look up at the stars.

* * *

You walk into a small domed building on a mysterious island. There is a plush red chair in the middle of the room, reclined like one found in a dentist’s office. You sit down, and notice a black box overhead, within reach. There are sliders and a small screen that is dark. You move the sliders but nothing happens. You get up to leave, but you notice a light switch next to the door. You flip it, and the room darkens. On a hunch you lie back in the chair and begin to move the sliders. The screen comes to life with star patterns and a date. Moving the sliders, you are able to see a patch of sky for every night for several years.

The above is a scene from Myst, the 1993 exploration/puzzle video game. Today it seems awfully quaint to dedicate an entire building to a single-purpose computer. Indeed, the handheld planetarium described above is a free iPhone app. But there was a certain allure, a certain prestige born of rarity, that once characterized both stargazing and computing. Now both have been cheapened by our ubiquitous devices and the influx of uninspired practitioners. Once, before my time, we had to fight for time on a mainframe; now anyone can learn to code.

Yes, anyone can code, in the same way that anyone can operate a telescope. But can you look up at the stars and wonder? Can you explain the stars themselves, apart from the feeble and transient machines we use to view them? Do recursive algorithms and finite state machines keep you up at night, or only wake you in the morning? We don’t need more coders, and we certainly don’t need more code. We need more computer scientists, who look up to the stars rather than down on their machines.

There’s not an app for that

This post originally appeared on the Tufts ACM blog.

When I heard Tufts was hosting a hackathon, I was thrilled. A collaborative all-nighter of coding, with sponsorship and prizes from big-name companies, in my backyard? It would be programming nirvana. Sign me up!

But when git push came to shove, most of the ideas were built on the “social, local, mobile” paradigm. That didn’t strike me as particularly innovative. We already have Facebook and the iPhone. I resisted the notion that everything is better as a mobile app.

I wanted an app that would help me have more conversations. With a colleague, I brainstormed one that would have been called “Get off your phone!” It would serve up suggestions like “look up at the unique architecture” or “go talk to that person over there,” but we concluded that even with a rating system it would just become a dumping ground for crude jokes and pickup lines. I abandoned the idea, along with any hope for a hair-of-the-dog solution. I wound up working on a school project instead.

One of the things I love about computer science is how diverse its applications are. You can mine data for scientific research, or design a clean and functional website, or write utilities that help manage a company’s workflow, or make a video game, or just become more aware of the patterns and relations of the things around us. Not everything we build has to be a time-waster for rich people (and on the global scale, we are all rich people). In fact, to build only these apps is a shameful waste of the privileged position we occupy as educated people.

The future is here. We all carry devices in our pockets that can tell us anything, anywhere. We’re living the always-on dream, but it’s time to wake up. We have crises in real life to attend to, like feeding seven billion people while fighting climate change.

There are some things apps just can’t do, and I don’t mean solve the halting problem. Our devices have caused us to be less connected, not more. Put the toy technologies back in your pocket. Time to go do something else now. Have coffee with a good friend, perhaps. Or at least a Monster with a teammate.

Boston I/O

Around 200 college students from the greater Boston area attended the Boston I/O conference today. It was a rapid-fire series of presentations. After an opening talk on the local entrepreneurial environment (which, I learned, is comparable to NYC and SF), the day became a parade of techniques and technologies, each promised by its presenter to solve all our problems. Some, like special design considerations for mobile and test-driven development, were genuinely informative and had good ideas. Other topics, like providing and consuming APIs and object-oriented programming, were touted as innovative when they are already part and parcel of the CS curriculum. Tools like Ruby on Rails, PostgreSQL, the Bourbon CSS library, and jQuery Mobile are even worse: transient tools that have come and will go. They are old wine in new bottles, and even newer bottles are being made all the time.

I exempt the text editor Vim from this fate for three reasons. One, I use Vim myself and enjoy the power it gives me, even if I only use a small fraction of it. Two, text editing has more staying power than the current web fad. Three, presenter Ben Orenstein did an excellent job of making a text editor look exciting and fun. (“This is like a video game you play at work.”)

Someone in the crowd mentioned Ajax, and I asked if I needed to know what it was. I was assured every major company used it to provide asynchronous experiences to their users. But seriously. Clojure? Django? Drupal? JSON? PhoneGap? Do all web and mobile developers have ADD? No wonder startups are starved for talent – no matter how many technologies you learn, someone else wants you to learn another. There’s always something you don’t know. Or is there?

As I said above, many of these techniques have been and will continue to be recycled. There are only so many ideas out there; only the names they go by are infinite. If I learn MVC architecture from building an iPhone app, write a game in Python, and study HTTP, do I really need to learn Rails? Programming isn’t about having a Home Depot’s worth of tools; it’s about having a toolbelt’s worth, carefully selected for power, precision, and speed. Oh, and a brain’s worth of knowledge on when and how to use them.
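To make the point concrete, here is a minimal sketch of the MVC split in plain Python; the class names (TodoModel, TodoView, TodoController) are invented for illustration and not taken from any framework. The division of responsibilities, with the model holding data, the view rendering it, and the controller mediating between them, is the same idea Rails and iOS dress up in their own vocabulary.

    # A minimal MVC sketch in plain Python. The names here are
    # illustrative only, not from any particular framework.

    class TodoModel:
        """Model: owns the data and nothing else."""
        def __init__(self):
            self.items = []

        def add(self, text):
            self.items.append(text)

    class TodoView:
        """View: renders the data; knows nothing about how it changes."""
        def render(self, items):
            for number, text in enumerate(items, start=1):
                print(f"{number}. {text}")

    class TodoController:
        """Controller: turns user intent into model updates, then re-renders."""
        def __init__(self, model, view):
            self.model = model
            self.view = view

        def handle_add(self, text):
            self.model.add(text)
            self.view.render(self.model.items)

    if __name__ == "__main__":
        controller = TodoController(TodoModel(), TodoView())
        controller.handle_add("Look up at the stars")
        controller.handle_add("Learn the idea, not the framework")

Swap the print statement for an HTML template or a UIView and you have, in miniature, the architecture every new framework keeps reselling.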

Kate Rutter, a user experience (UX) guru who Skyped in, made some important points of lasting value. Design, said Rutter, is about empathy. The UX work comes before a single line of code is written, because before we “build things right”, we need to “build the right things”. Users don’t want features; they want uses. I believe that computers exist to serve humans, not the other way around, but Rutter makes a very practical point: no amount of aesthetic design, testing, code refactoring, or mobile hardware matters if the features serve no purpose; all that work is wasted.

I wonder how many of my fellow programmers, in the hectic startup culture and development cycle, ever pause to reflect. Are you working on a fad that will inevitably give way to The Next Big Thing? Are you working on a dead-end project that no one will want to use? Are you so caught up in buzzwords you fail to see the history of ideas trailing behind them? I’m not.

Don’t get me wrong, Boston I/O was a worthwhile event. But I can’t help but wonder how many of the technologies we heard about will still exist in ten years, and how many will be piled on the unforgiving junkheap of history.

Hacking Medicine

Max. Go to this!!!!!! Serious!!! Register now. Please. U Wont regret!!

That, and a link, was the entire email I got from my brother-in-law(-to-be) telling me about a conference called Hacking Medicine that took place last weekend. I had my doubts about getting in, but I applied, and somehow I was accepted. Long story short, I got to spend two days with a roomful of incredibly smart and awesome people at the MIT Media Lab.