The Internet of Things

Cloud, Big Data, and now the Internet of Things? Only one of them is being developed in garages. We explore the impact of connected objects and why they are more than just the latest round of buzzwords.

In March 2011, after the devastating earthquake and tsunami sent the Fukushima Dai-ichi Nuclear Power Plant sliding into disaster, information on radioactive emissions was hard to come by. As the Tokyo Electric Power Company, the plant's operator, and the Japanese government scrambled to bring the situation at the failing plant under control, the public was left wondering what was happening and how it might affect their livelihoods.

It was this void of information that Safecast, a group of volunteers founded just a week after the tsunami, wanted to fill. Equipped with Geiger counters, Safecast set out to map radiation levels, make that information publicly available, and lift the fog of confusion and misinformation around the nuclear power plant. To this day, Safecast has collected more than 3.2m radiation measurements and has vastly expanded its scope, from mapping radiation levels in Japan after the nuclear crisis to gathering baseline radiation data for the whole planet.

Safecast was joined in its efforts by tinkering hobbyists, like the members of the Tokyo Hackerspace, who wanted to feed this critical information onto the internet in real time. By combining donated Geiger counters with Arduino development boards, an open source electronics prototyping platform, they were able to do just that. Platforms like Cosm (formerly Pachube), which connect people to devices, applications and the Internet of Things, made it easy to aggregate all that data in one usable place. That in turn sparked visualisations like Haiyan Zhang's 'Japan Geigermap', which takes the available information and presents it in an easily understandable form, giving meaning to the measurements gathered in Japan.
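
To give a sense of how small that data path really is, here is a minimal sketch of its receiving end, assuming an Arduino that counts Geiger tube pulses and prints a counts-per-minute value over USB serial once a minute. The serial port, feed URL, API key header, and the CPM-to-µSv/h factor are illustrative placeholders, not Cosm's actual API or Safecast's calibration.

```python
# Rough sketch of the hobbyist data path: an Arduino counts Geiger tube pulses
# and prints counts-per-minute (CPM) over USB serial; this script converts each
# reading to a dose rate and pushes it to a web feed.
# The port, endpoint, API key, and conversion factor below are assumptions.
import json
import time
import urllib.request

import serial  # pyserial; the Arduino shows up as a USB serial device

PORT = "/dev/ttyUSB0"                              # hypothetical serial port
FEED_URL = "https://example.org/feeds/radiation"   # placeholder aggregator endpoint
API_KEY = "YOUR-API-KEY"                           # placeholder credential
USV_PER_CPM = 1 / 120.0                            # tube-specific factor (assumed)

def post_reading(cpm: float) -> None:
    """Send one timestamped reading to the feed."""
    payload = json.dumps({
        "at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "cpm": cpm,
        "usv_per_hour": round(cpm * USV_PER_CPM, 3),
    }).encode()
    req = urllib.request.Request(
        FEED_URL, data=payload, method="POST",
        headers={"Content-Type": "application/json", "X-ApiKey": API_KEY})
    urllib.request.urlopen(req, timeout=10)

with serial.Serial(PORT, 9600, timeout=70) as geiger:
    while True:
        line = geiger.readline().decode(errors="ignore").strip()
        if line:                       # the Arduino prints one CPM value per minute
            post_reading(float(line))
```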

This interplay of sensor measurements of the world, data collection and aggregation, and presentation in an understandable, actionable way is at the heart of the Internet of Things, and the response to the Fukushima Dai-ichi disaster makes for one of its most compelling cases.

The ‘Internet of Things’ is a term first coined in 1999 at the Auto-ID Center, an MIT research lab, to describe the phenomenon triggered by the slowly growing adoption of radio-frequency identification (RFID) chips, which gave objects a unique digital address. The assumption behind the original Auto-ID concept was that both the chips and the readers would see such massive adoption that every object could be located anywhere, creating a digital map of the world's objects: an Internet of Things.

The modern interpretation of the Internet of Things goes far beyond that and is probably better captured by ‘things on the internet’. It describes the increasing prevalence of ‘digital hybrids’: everyday devices that have digital capabilities and can communicate over the internet. Computation no longer happens only in traditional computers; it gets woven into ordinary objects. This is by no means a futurist scenario anymore. The latest numbers show that devices not operated by humans and connected to the wider internet outnumber humans with internet access 5bn to 2.3bn. On the internet, humans are already a minority.

When Gordon Moore formulated his now famous law in 1965, it would have been an act of insanity to predict that it would still hold true today, almost 50 years later. Moore himself projected his law would hold for two decades. But here we are in 2012, and Moore's law is still alive and kicking. The exponential growth curve affords us ever cheaper and more powerful computation, and technological innovation along with it. And it's not only Moore's law that fuels this expansion. Recently, Stanford professor Jonathan Koomey formulated ‘Koomey's Law’, a corollary to Moore's law, which indicates that the energy efficiency of computation doubles every one and a half years. In more concrete terms, this means that a current industry-standard laptop would drain its battery completely in a matter of 1.5 seconds if it were working at 1990 energy-efficiency levels.
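
As a quick sanity check of that figure, here is a back-of-envelope calculation; the 2012 end date and a roughly ten-hour battery life at today's efficiency are assumptions made purely for the arithmetic.

```python
# Back-of-envelope check of the 1.5-second figure.
# Assumptions (not from the article): the comparison runs from 1990 to 2012,
# efficiency doubles every 1.5 years, and a current laptop battery lasts
# roughly ten hours under normal use.
years = 2012 - 1990                      # 22 years of efficiency gains
doublings = years / 1.5                  # Koomey's Law: one doubling per 1.5 years
efficiency_factor = 2 ** doublings       # ~26,000x more computation per joule

battery_life_today_s = 10 * 60 * 60      # ~10 hours, in seconds
battery_life_1990_s = battery_life_today_s / efficiency_factor

print(f"Efficiency gain since 1990: ~{efficiency_factor:,.0f}x")
print(f"Battery life at 1990 efficiency: ~{battery_life_1990_s:.1f} seconds")
# Prints roughly 1.4 seconds, in line with the figure quoted above.
```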

The combined effect of these two laws makes it feasible to embed chips in all kinds of applications. From sports equipment to medical appliances to car-sharing services, networked objects and services are becoming commonplace and will fuel the next big expansion of technology.

Industry has of course taken notice of these trends and is applying the corresponding technical advances. Supply chain managers at most big corporations know at any time where their cargo is, thanks to embedded chips that relay location data to central control servers. The much-discussed smart grid aims to do something similar for electricity, bringing 20th-century power grids up to 21st-century technology. By adding communications infrastructure and a multitude of sensors to the electrical grid, utilities hope to make the supply more robust and efficient. Especially with the increasing role that intermittent, renewable energy sources play in the overall supply, more real-time information is needed to keep the grid stable. Consumers profit from this wealth of information as well: from in-home energy dashboards to monthly analytics reports, these tools give them the means to make sense of their own energy consumption.

And then there’s SFpark, an experiment by the City of San Francisco. Confronted with the ubiquitous problem of scarce parking space, the city set out to do something about it. By equipping parking spaces in downtown San Francisco with occupancy sensors, the San Francisco Municipal Transportation Agency changed the dynamics of parking, introducing variable tariffs that depend on overall occupancy in a given block of the city. The sensors allow the city to automatically adjust prices to match supply and demand, creating incentives for parking to spread out more evenly and improving the efficiency of parking overall. Customers know in advance where parking is available and how much it costs.
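
To make the mechanism concrete, here is a minimal sketch of what such a demand-responsive pricing rule could look like. The occupancy bands, 25-cent steps, and price limits are illustrative assumptions, not SFpark's published schedule.

```python
# Illustrative sketch of a demand-responsive parking rule (not SFpark's actual
# schedule): nudge each block's hourly rate up when it is nearly full and down
# when it sits mostly empty, within a fixed price band.

def adjust_hourly_rate(current_rate: float, occupancy: float) -> float:
    """Return the next hourly rate for a block, given its average occupancy (0-1)."""
    if occupancy >= 0.80:        # block almost full: raise the price
        step = +0.25
    elif occupancy >= 0.60:      # target range: leave the price alone
        step = 0.00
    elif occupancy >= 0.30:      # plenty of free spaces: lower the price
        step = -0.25
    else:                        # mostly empty: lower it further
        step = -0.50
    return min(6.00, max(0.25, round(current_rate + step, 2)))  # clamp to a band

# Example: a block at 92% occupancy and $2.00/hour gets nudged to $2.25/hour.
print(adjust_hourly_rate(2.00, 0.92))
```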

It is this prospect of vastly increased efficiency that is sparking investments in infrastructure build-outs. IBM projects that by 2020 there will be 100bn internet-connected devices in the world.

However, one of the most prolific and innovative markets for the Internet of Things right now is health and fitness, where movements like the ‘Quantified Self’ have long pursued self-improvement through measurement. Taking to heart Lord Kelvin's belief that “if you cannot measure it, you cannot improve it”, its adherents set out to precisely measure large portions of their everyday lives and subject them to analysis.

The early tools they worked with were crude: personal data gathered into spreadsheets and graphed over time. But once chips became cheap enough to embed in everything from body scales to step counters, things really took off. It didn't take long for the sports industry to notice. The Nike+ system was the first to couple a physical sensor object with digital services at large consumer scale. Now consumers can choose from a wide array of suppliers: the Withings scale, which transmits your weight to a server and displays it as a graph over time; the Fitbit step counter; or the Jawbone Up, a sensor bracelet that reminds you to exercise regularly. The latter two can have their measurements correlated with your weight over time, showing you how effective your workouts are.

The plunging cost of computer chips is not only finding its way into fitness gizmos. The Arduino development platform was first introduced in 2005 and has powered countless hobby projects. Just this year, the Raspberry Pi was released: a full-scale computer capable of running Linux, with a price tag of just USD 25. It has already spawned a multitude of hobbyist hacks. It is these development platforms that fuel the latest forays into citizen science and start-up product development.

The crowdfunding platform Kickstarter is awash with sensor platforms that talk to the internet. Most of them are funded within days, and they all promise essentially the same thing: packaging available sensors and communications capabilities in an easy-to-use form so that hobbyists can experiment with them.

One such example is the Air Quality Egg. At its core, the Air Quality Egg is a development board that hosts multiple sensors, from CO2 and NO2 sensors to optical micro-particle sensors and even Geiger tubes. Platforms like this try to enable citizens to make sense of their environment. The Air Quality Egg gives citizens the ability to engage in a discourse about the world they live in. More importantly, it allows them to hold officials accountable to emissions standards and pollution levels.

It is these low-cost development platforms that not only change our notions of what constitutes computation, but also enable movements like Safecast and the ad hoc response to the nuclear emissions in Japan. Ed Borden of Cosm wrote in a blog post, following the surge of activity around Geiger counter readings in Japan, that it “[…] never occurred to [him] that crowdsourcing radiation data from Geiger counters would be an application for Pachube!”

It is that sense of opportunity, of unforeseen applications, that makes the Internet of Things feel like the internet of the 1990s. As Andy Huntington of BERG, a design consultancy in London, put it: we're in the “Geocities of Things” stage. Just as Geocities let droves of regular internet users try their hand at creating websites, and just as with the internet before Google and Facebook, we don't yet know where this journey will go.

What we do know is that many people are trying many things. Projects like the Good Night Lamp, by European Internet of Things champion Alexandra Deschamps-Sonsino, use this technology to create avenues for ambient awareness and intimacy by signaling whether your loved ones are at home. Projects like Botanicalls measure the humidity of the soil in your house plants and remind you to water them. The Copenhagen Wheel, by the MIT Senseable City Lab, lets you plan better cycling routes by assembling data on air pollution and vehicle proximity along your rides. And the experiments are getting ever more daring.

Robert Metcalfe, the inventor of Ethernet, one of the fundamental technologies of the internet, is the namesake of Metcalfe's Law. The law posits that the value of a network grows with the square of the number of its nodes, because each new node adds not just one new connection but a connection to every other node on the network. This is fundamental to the so-called ‘network effects’ that platforms like Facebook harness to protect their businesses. But the law also provides strong incentives to bring more and more applications onto the internet.
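
A quick way to see that quadratic growth: a fully meshed network of n nodes allows n(n−1)/2 distinct links, so the node that joins as number n+1 brings n brand-new connections with it. The short snippet below simply tabulates that count.

```python
# Pairwise links between n nodes: n * (n - 1) / 2, the quadratic growth
# behind Metcalfe's Law.

def possible_links(n: int) -> int:
    """Number of distinct connections a fully meshed network of n nodes allows."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} nodes -> {possible_links(n):>7} possible links")
# Joining as node n+1 adds n brand-new links, so the network's connective
# value grows far faster than its headcount.
```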

Industry players are looking to increase the efficiency with which they manage their businesses. Science is increasingly deploying sensor networks, from autonomous diving robots that map deep-water ocean currents to sensors that measure the CO2 held by the soil in suburban Baltimore. We are deploying this technology to make sense of the world, to get more immediate feedback on our actions, and to better track our businesses and habits.

But the most interesting features of the ‘Internet of Things’ are being developed by small start-up outfits and tinkering hobbyists in hackerspaces. They are born out of need, of the urge to realize an idea, or even just of the wish to say, as in the days of Geocities, “Hello World, I am here.” It is those little hacks, like the old Geiger counters in the Tokyo Hackerspace given new embedded capabilities, that have the potential to drastically change our interaction with the world.

Electronic devices can all potentially become part of the net: smartphones become bundles of sensors for the network, and smart grids will influence our power consumption and its reliability. To us, the Internet of Things is also a great example of a field where the most interesting innovations are created by bottom-up groups and networks instead of top-down companies and institutions. Given the alarming number of fissures manifesting in more and more of those institutions, it's reassuring to see such vitality in more agile systems — AR
